Repository: PacktPublishing/Transformers-for-Natural-Language-Processing Branch: main Commit: 149c3a314e0e Files: 40 Total size: 35.9 MB Directory structure: gitextract_98o3bluv/ ├── .other/ │ └── technical_requirements.md ├── Chapter01/ │ ├── Multi_Head_Attention_Sub_Layer.ipynb │ ├── positional_encoding.ipynb │ └── text.txt ├── Chapter02/ │ ├── BERT_Fine_Tuning_Sentence_Classification_DR.ipynb │ ├── in_domain_train.tsv │ └── out_of_domain_dev.tsv ├── Chapter03/ │ ├── KantaiBERT.ipynb │ └── kant.txt ├── Chapter04/ │ └── Transformer_tasks.ipynb ├── Chapter05/ │ ├── BLEU.py │ ├── Trax_Translation.ipynb │ ├── read.py │ └── read_clean.py ├── Chapter06/ │ ├── OpenAI_GPT_2.ipynb │ ├── Training_OpenAI_GPT_2.ipynb │ ├── gpt-2-train_files/ │ │ ├── accumulate.py │ │ ├── dset.txt │ │ ├── encode.py │ │ ├── load_dataset.py │ │ ├── memory_saving_gradients.py │ │ └── train.py │ └── head_view_bert.ipynb ├── Chapter07/ │ └── Summarizing_Text_with_T5.ipynb ├── Chapter08/ │ ├── Summarizing_Text_V2.ipynb │ ├── Tokenizer.ipynb │ ├── Training_OpenAI_GPT_2_CH08.ipynb │ ├── gpt-2-train_files/ │ │ ├── accumulate.py │ │ ├── encode.py │ │ ├── load_dataset.py │ │ ├── mdset.txt │ │ ├── memory_saving_gradients.py │ │ └── train.py │ └── text.txt ├── Chapter09/ │ └── SRL.ipynb ├── Chapter10/ │ ├── Haystack_QA_Pipeline.ipynb │ └── QA.ipynb ├── Chapter11/ │ └── SentimentAnalysis.ipynb ├── Chapter12/ │ └── Fake_News.ipynb └── README.md ================================================ FILE CONTENTS ================================================ ================================================ FILE: .other/technical_requirements.md ================================================ **Software and hardware specifications** ========================================
| Chapter number | Software required (with version) | Hardware specifications | OS required |
|----------------|----------------------------------|-------------------------|-------------|
| 1-12 | Python 3.8; any modern web browser, such as Firefox, Edge, Safari, or Chrome (recommended) | x86/AMD64 system | Windows, any Linux distro, or macOS |
**\*Note**: The code in this book is in the form of Python notebooks and can be executed in Google Colaboratory. If you wish to execute it on your local machine, the Python package requirements are listed in the following section.

**Package specifications**
==========================

| **Package required** | **Version** | **Installation command (pip)** |
|----------------------|-------------------|--------------------------------|
| Transformers | 4.1.1 or higher | `pip install transformers` |
| gensim | 3.8.3 or higher | `pip install gensim` |
| TensorFlow | 2.4.0 or higher | `pip install tensorflow` |
| NumPy | 1.19.5 or higher | `pip install numpy` |
| SciPy | 1.6.0 or higher | `pip install scipy` |
| pandas | 1.2.0 or higher | `pip install pandas` |
| Matplotlib | 3.3.3 or higher | `pip install matplotlib` |
| scikit-learn | 0.24.0 or higher | `pip install scikit-learn` |
| toposort | 1.6.0 or higher | `pip install toposort` |
| SentencePiece | 0.1.94 or higher | `pip install sentencepiece` |
| trax | 1.3.7 or higher | `pip install trax` |
| AllenNLP | 1.0.0 or higher | `pip install allennlp` |
| allennlp-models | 1.0.0 or higher | `pip install allennlp-models` |
| farm-haystack | 0.6.0 or higher | `pip install farm-haystack` |
| Torch | 1.6.0+cu101 or higher | `pip install torch==1.6.0+cu101` |

**\*Note**: This isn't an exhaustive list of all the packages required to run the code in this book, only the essential and most commonly used packages. You will likely encounter more required packages as you read through the book, which you can install using `pip` or `conda`.
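Since the table lists minimum versions, a quick way to check a local environment against them is to compare installed versions programmatically. A minimal sketch (not from the book; the `MIN_VERSIONS` subset and the naive `parse` helper are illustrative assumptions, and `parse` only handles plain `x.y.z`-style strings):

```python
# Sketch: check a few of the minimum versions from the table above.
# Assumes Python 3.8+, where importlib.metadata is in the standard library.
from importlib.metadata import version, PackageNotFoundError

MIN_VERSIONS = {
    "transformers": "4.1.1",
    "gensim": "3.8.3",
    "numpy": "1.19.5",
}

def parse(v):
    # Naive numeric comparison; strips local tags like "+cu101" and
    # keeps at most three numeric components ("1.6.0+cu101" -> (1, 6, 0)).
    return tuple(int(p) for p in v.split("+")[0].split(".")[:3])

for pkg, minimum in MIN_VERSIONS.items():
    try:
        ok = parse(version(pkg)) >= parse(minimum)
        print(f"{pkg}: {'OK' if ok else 'too old'}")
    except (PackageNotFoundError, ValueError):
        print(f"{pkg}: not installed or version not comparable")
```

For production use, `packaging.version.parse` handles pre-release and local version segments more robustly than this numeric tuple comparison.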
================================================ FILE: Chapter01/Multi_Head_Attention_Sub_Layer.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Multi-Head Attention Sub-Layer.ipynb", "provenance": [], "collapsed_sections": [] }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "accelerator": "GPU", "widgets": { "application/vnd.jupyter.widget-state+json": { "946c90b82f7f46caa25c885668b75eab": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_4191af78535e4da8bb797690eff84e00", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_9ce3d57b96b64da0b15e3f3626bacb30", "IPY_MODEL_f8da2c91156342a69d9b262f4f993aa4" ] } }, "4191af78535e4da8bb797690eff84e00": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, 
"object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "9ce3d57b96b64da0b15e3f3626bacb30": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_97370923218945c5b80ab468751ac8a7", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 230, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 230, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_0ba4a91f472e4c41ba80ab4025288446" } }, "f8da2c91156342a69d9b262f4f993aa4": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_15aa4b6f8f784c74804107be249126b9", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 230/230 [00:01<00:00, 185B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_edea457617ed4792aeeb65292019ceb4" } }, "97370923218945c5b80ab468751ac8a7": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "0ba4a91f472e4c41ba80ab4025288446": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": 
null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "15aa4b6f8f784c74804107be249126b9": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "edea457617ed4792aeeb65292019ceb4": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", 
"justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } } } } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "aXACkAtfNpG0", "colab_type": "text" }, "source": [ "# The Attention Mechanism\n", "Copyright 2020, Denis Rothman, MIT License. Denis Rothman rewrote the reference notebook entirely in basic Python with no frameworks. Three more steps were added, and a Hugging Face transformer example was added. The original images were taken out, redesigned by Denis Rothman for educational purposes, and inserted in the book descriptions of the multi-attention sub-layer.\n", "\n", "[The Reference Colaboratory Notebook was written by Manuel Romero](https://colab.research.google.com/drive/1rPk3ohrmVclqhH7uQ7qys4oznDdAhpzF)\n", "\n", "[A Medium article was written by Raimi Karim](https://towardsdatascience.com/illustrated-self-attention-2d627e33b20a)" ] }, { "cell_type": "code", "metadata": { "id": "veRoFjFRNXwJ", "colab_type": "code", "colab": {} }, "source": [ "import numpy as np\n", "from scipy.special import softmax" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "JLe9lWCJNogW", "colab_type": "code", "outputId": "733e039b-343e-4161-9919-19b3a1ec130f", "colab": { "base_uri": "https://localhost:8080/", "height": 90 } }, "source": [ "print(\"Step 1: Input : 3 inputs, d_model=4\")\n", "x =np.array([[1.0, 0.0, 1.0, 0.0], # Input 1\n", " [0.0, 2.0, 0.0, 2.0], # Input 2\n", " [1.0, 1.0, 1.0, 1.0]]) # Input 3\n", "print(x)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 1: Input : 3 inputs, d_model=4\n", "[[1. 0. 1. 
0.]\n", " [0. 2. 0. 2.]\n", " [1. 1. 1. 1.]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "JZImwtHPN91V", "colab_type": "code", "outputId": "07706940-e200-4956-b957-fe9681139d0d", "colab": { "base_uri": "https://localhost:8080/", "height": 126 } }, "source": [ "print(\"Step 2: weights 3 dimensions x d_model=4\")\n", "print(\"w_query\")\n", "w_query =np.array([[1, 0, 1],\n", " [1, 0, 0],\n", " [0, 0, 1],\n", " [0, 1, 1]])\n", "print(w_query)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 2: weights 3 dimensions x d_model=4\n", "w_query\n", "[[1 0 1]\n", " [1 0 0]\n", " [0 0 1]\n", " [0 1 1]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "7kRBS7MUOFgV", "colab_type": "code", "outputId": "8b0bcc03-88b1-4e8d-a483-dacc91ffa9ee", "colab": { "base_uri": "https://localhost:8080/", "height": 108 } }, "source": [ "print(\"w_key\")\n", "w_key =np.array([[0, 0, 1],\n", " [1, 1, 0],\n", " [0, 1, 0],\n", " [1, 1, 0]])\n", "print(w_key)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "w_key\n", "[[0 0 1]\n", " [1 1 0]\n", " [0 1 0]\n", " [1 1 0]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "Napm2VtkOIEN", "colab_type": "code", "outputId": "7331eb08-64d5-4a36-eeef-0a0a556f130b", "colab": { "base_uri": "https://localhost:8080/", "height": 108 } }, "source": [ "print(\"w_value\")\n", "w_value = np.array([[0, 2, 0],\n", " [0, 3, 0],\n", " [1, 0, 3],\n", " [1, 1, 0]])\n", "print(w_value)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "w_value\n", "[[0 2 0]\n", " [0 3 0]\n", " [1 0 3]\n", " [1 1 0]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "JqapIgfDOQ7d", "colab_type": "code", "outputId": "fd610d7a-968a-47e6-d614-40ad03c1d172", "colab": { "base_uri": "https://localhost:8080/", "height": 108 } }, "source": [ "print(\"Step 3: Matrix multiplication to obtain Q,K,V\")\n", "\n", 
"print(\"Queries: x * w_query\")\n", "Q=np.matmul(x,w_query)\n", "print(Q)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 3: Matrix multiplication to obtain Q,K,V\n", "Queries: x * w_query\n", "[[1. 0. 2.]\n", " [2. 2. 2.]\n", " [2. 1. 3.]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "NmfMln1Wmv73", "colab_type": "code", "outputId": "065b63ba-7584-4302-97cd-d5e1765470ed", "colab": { "base_uri": "https://localhost:8080/", "height": 108 } }, "source": [ "print(\"Step 3: Matrix multiplication to obtain Q,K,V\")\n", "\n", "print(\"Keys: x * w_key\")\n", "K=np.matmul(x,w_key)\n", "print(K)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 3: Matrix multiplication to obtain Q,K,V\n", "Keys: x * w_key\n", "[[0. 1. 1.]\n", " [4. 4. 0.]\n", " [2. 3. 1.]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "v3Asv-8mOWkN", "colab_type": "code", "outputId": "2ec71310-0486-46f4-d9f5-d12a1a6ad0e6", "colab": { "base_uri": "https://localhost:8080/", "height": 90 } }, "source": [ "print(\"Values: x * w_value\")\n", "V=np.matmul(x,w_value)\n", "print(V)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Values: x * w_value\n", "[[1. 2. 3.]\n", " [2. 8. 0.]\n", " [2. 6. 3.]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "gfgRAHUuOp5c", "colab_type": "code", "outputId": "ad02f055-11e0-4b9a-eb15-b66e4846c95e", "colab": { "base_uri": "https://localhost:8080/", "height": 90 } }, "source": [ "print(\"Step 4: Scaled Attention Scores\")\n", "k_d=1 #square root of k_d=3 rounded down to 1 for this example\n", "attention_scores = (Q @ K.transpose())/k_d\n", "print(attention_scores)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 4: Scaled Attention Scores\n", "[[ 2. 4. 4.]\n", " [ 4. 16. 12.]\n", " [ 4. 12. 
10.]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "hg2t6KuNOjzM", "colab_type": "code", "outputId": "c0610f91-cd1d-4b0f-b5ce-f6445481186a", "colab": { "base_uri": "https://localhost:8080/", "height": 90 } }, "source": [ "print(\"Step 5: Scaled softmax attention_scores for each vector\")\n", "attention_scores[0]=softmax(attention_scores[0])\n", "attention_scores[1]=softmax(attention_scores[1])\n", "attention_scores[2]=softmax(attention_scores[2])\n", "print(attention_scores[0])\n", "print(attention_scores[1])\n", "print(attention_scores[2])" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 5: Scaled softmax attention_scores for each vector\n", "[0.06337894 0.46831053 0.46831053]\n", "[6.03366485e-06 9.82007865e-01 1.79861014e-02]\n", "[2.95387223e-04 8.80536902e-01 1.19167711e-01]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "R4Es7A7NOvjD", "colab_type": "code", "outputId": "b86060fe-1292-47c5-93f6-ddeeca1bfb62", "colab": { "base_uri": "https://localhost:8080/", "height": 199 } }, "source": [ "print(\"Step 6: attention value obtained by score1/k_d * V\")\n", "print(V[0])\n", "print(V[1])\n", "print(V[2])\n", "print(\"Attention 1\")\n", "attention1=attention_scores[0].reshape(-1,1)\n", "attention1=attention_scores[0][0]*V[0]\n", "print(attention1)\n", "\n", "print(\"Attention 2\")\n", "attention2=attention_scores[0][1]*V[1]\n", "print(attention2)\n", "\n", "print(\"Attention 3\")\n", "attention3=attention_scores[0][2]*V[2]\n", "print(attention3)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 6: attention value obtained by score1/k_d * V\n", "[1. 2. 3.]\n", "[2. 8. 0.]\n", "[2. 6. 3.]\n", "Attention 1\n", "[0.06337894 0.12675788 0.19013681]\n", "Attention 2\n", "[0.93662106 3.74648425 0. 
]\n", "Attention 3\n", "[0.93662106 2.80986319 1.40493159]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "uBDKhaCvOzXj", "colab_type": "code", "outputId": "138901d8-0aa9-4db9-b8b1-76ad557e6688", "colab": { "base_uri": "https://localhost:8080/", "height": 54 } }, "source": [ "print(\"Step 7: summed the results to create the first line of the output matrix\")\n", "attention_input1=attention1+attention2+attention3\n", "print(attention_input1)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 7: summed the results to create the first line of the output matrix\n", "[1.93662106 6.68310531 1.59506841]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "iEjgRcqHO4ik", "colab_type": "code", "outputId": "675a154b-a305-4c0c-e314-353541abfd3e", "colab": { "base_uri": "https://localhost:8080/", "height": 635 } }, "source": [ "print(\"Step 8: Step 1 to 7 for inputs 1 to 3\")\n", "#We assume we have 3 results with learned weights (they were not trained in this example)\n", "#We assume we are implementing the original Transformer paper. 
We will have 3 results of 64 dimensions each\n", "attention_head1=np.random.random((3, 64))\n", "print(attention_head1)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 8: Step 1 to 7 for inputs 1 to 3\n", "[[0.05750794 0.25966685 0.80912647 0.00841755 0.53786959 0.05089332\n", " 0.17938191 0.91091697 0.20593063 0.27634727 0.33869867 0.25488968\n", " 0.88673807 0.56544205 0.69075114 0.56069125 0.92579273 0.46042461\n", " 0.78471374 0.93064241 0.99626239 0.13662306 0.72892312 0.52327088\n", " 0.90128711 0.28245531 0.05630861 0.55857421 0.50998676 0.59709355\n", " 0.40038745 0.70580749 0.18971837 0.78544634 0.35815199 0.57527984\n", " 0.38283035 0.94917395 0.25450774 0.85725663 0.27262613 0.5720429\n", " 0.38092713 0.34721503 0.38857267 0.50218029 0.74035216 0.37789311\n", " 0.12812721 0.42074447 0.39534834 0.4927362 0.65353466 0.86485487\n", " 0.22989766 0.87239043 0.64613354 0.89034403 0.29338559 0.1671029\n", " 0.1675619 0.70683457 0.03683821 0.37657364]\n", " [0.08308343 0.01529261 0.34000535 0.48559272 0.25036425 0.98195061\n", " 0.72015388 0.03838282 0.18674587 0.33203929 0.82965726 0.6962791\n", " 0.49038184 0.97126469 0.25373185 0.18486967 0.38352481 0.68254099\n", " 0.01014604 0.51217341 0.17219508 0.14178547 0.74892979 0.12190071\n", " 0.0090985 0.09704158 0.70447804 0.21374912 0.72523093 0.89713875\n", " 0.28817021 0.56472583 0.59136866 0.7711216 0.78839121 0.03607145\n", " 0.33438564 0.99970048 0.80579864 0.79923327 0.57124039 0.64183951\n", " 0.11464931 0.703289 0.64033748 0.5799896 0.14488077 0.90946673\n", " 0.4189947 0.99825172 0.28607413 0.6801013 0.16240732 0.25219133\n", " 0.30470031 0.30292756 0.15999459 0.52230381 0.82012623 0.33586634\n", " 0.25613996 0.60354742 0.26006038 0.23281006]\n", " [0.37977727 0.7429604 0.38837932 0.18434243 0.84440271 0.53955069\n", " 0.40121556 0.83114666 0.48845808 0.58768546 0.4097926 0.29445373\n", " 0.22750019 0.9520429 0.99964437 0.57829693 0.32369595 0.60769326\n", " 0.76116892 
0.14857116 0.07462658 0.01199289 0.37147371 0.80177111\n", " 0.60845313 0.33410248 0.06017335 0.447363 0.31500924 0.95988807\n", " 0.41506716 0.33740287 0.38991258 0.23478571 0.57808465 0.48520973\n", " 0.48241035 0.35030686 0.90598744 0.1296871 0.57966373 0.98736092\n", " 0.43859306 0.5358377 0.25181342 0.0195783 0.51178364 0.26981021\n", " 0.04674047 0.97762416 0.72747288 0.75616534 0.68105477 0.06914679\n", " 0.14054312 0.42816012 0.66792325 0.03168237 0.68685758 0.43487164\n", " 0.08064005 0.23444144 0.60360253 0.21423994]]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "QI50dkZ1O630", "colab_type": "code", "outputId": "7d467842-f837-4e41-e099-534549b6fc05", "colab": { "base_uri": "https://localhost:8080/", "height": 54 } }, "source": [ "print(\"Step 9: We assume we have trained the 8 heads of the attention sub-layer\")\n", "z0h1=np.random.random((3, 64))\n", "z1h2=np.random.random((3, 64))\n", "z2h3=np.random.random((3, 64))\n", "z3h4=np.random.random((3, 64))\n", "z4h5=np.random.random((3, 64))\n", "z5h6=np.random.random((3, 64))\n", "z6h7=np.random.random((3, 64))\n", "z7h8=np.random.random((3, 64))\n", "print(\"shape of one head\",z0h1.shape,\"dimension of 8 heads\",64*8)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 9: We assume we have trained the 8 heads of the attention sub-layer\n", "shape of one head (3, 64) dimension of 8 heads 512\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "3n87LE92_Puf", "colab_type": "code", "outputId": "55d00415-ebea-43a6-b4c5-ff13e02c3052", "colab": { "base_uri": "https://localhost:8080/", "height": 90 } }, "source": [ "print(\"Step 10: Concatenation of heads 1 to 8 to obtain the original 8x64=512 output dimension of the model\")\n", "output_attention=np.hstack((z0h1,z1h2,z2h3,z3h4,z4h5,z5h6,z6h7,z7h8))\n", "print(output_attention)" ], "execution_count": 0, "outputs": [ { "output_type": "stream", "text": [ "Step 10: Concantenation of 
heads 1 to 8 to obtain the original 8x64=512 ouput dimension of the model\n", "[[0.46950893 0.88546586 0.47615937 ... 0.08285802 0.16577096 0.61094461]\n", " [0.31638247 0.24246402 0.30390966 ... 0.42283366 0.62127905 0.64414042]\n", " [0.1922683 0.7017995 0.60116595 ... 0.20012387 0.16264044 0.93645276]]\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "PJLl4Jf3fPLh", "colab_type": "text" }, "source": [ "And now with Hugging Face in one line!" ] }, { "cell_type": "code", "metadata": { "id": "CZIRvcRmfTPb", "colab_type": "code", "colab": {} }, "source": [ "#@title Transformer Installation\n", "!pip -qq install transformers" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "cNwLYc-SfXdF", "colab_type": "code", "outputId": "d1314cc6-74d6-45cf-b8d6-0a903e58ac60", "colab": { "base_uri": "https://localhost:8080/", "height": 85, "referenced_widgets": [ "946c90b82f7f46caa25c885668b75eab", "4191af78535e4da8bb797690eff84e00", "9ce3d57b96b64da0b15e3f3626bacb30", "f8da2c91156342a69d9b262f4f993aa4", "97370923218945c5b80ab468751ac8a7", "0ba4a91f472e4c41ba80ab4025288446", "15aa4b6f8f784c74804107be249126b9", "edea457617ed4792aeeb65292019ceb4" ] } }, "source": [ "#@title Retrieve pipeline of modules and choose English to French translation\n", "from transformers import pipeline\n", "translator = pipeline(\"translation_en_to_fr\")\n", "#One line of code!\n", "print(translator(\"It is easy to translate languages with transformers\", max_length=40))" ], "execution_count": 0, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "946c90b82f7f46caa25c885668b75eab", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n", "[{'translation_text': 'Il est facile de traduire des 
langues avec des transformateurs.'}]\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter01/positional_encoding.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "positional_encoding.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "name": "python3", "display_name": "Python 3" } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "7fjcTlyE3WvR" }, "source": [ "#A Positional Encoding Example\n", "Copyright 2021 Denis Rothman, MIT License\n", "\n", "Reference 1 for Positional Encoding:\n", "Attention is All You Need paper, page 6,Google Brain and Google Research\n", "\n", "\n", "Reference 2 for word embedding:\n", "https://www.geeksforgeeks.org/python-word-embedding-using-word2vec/\n", "Reference 3 for cosine similarity:\n", "SciKit Learn cosine similarity documentation\n" ] }, { "cell_type": "code", "metadata": { "id": "JKJ8Saf6vR9b", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "654ce4ae-0115-46d4-a186-ee2581f1ee4f" }, "source": [ "!pip install gensim==3.8.3\n", "import torch\n", "import nltk\n", "nltk.download('punkt')" ], "execution_count": 6, "outputs": [ { "output_type": "stream", "text": [ "Requirement already satisfied: gensim==3.8.3 in /usr/local/lib/python3.7/dist-packages (3.8.3)\n", "Requirement already satisfied: scipy>=0.18.1 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.4.1)\n", "Requirement already satisfied: smart-open>=1.8.1 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (4.2.0)\n", "Requirement already satisfied: numpy>=1.11.3 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.19.5)\n", "Requirement already satisfied: six>=1.5.0 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.15.0)\n", "[nltk_data] Downloading package punkt to /root/nltk_data...\n", "[nltk_data] Package punkt is already 
up-to-date!\n" ], "name": "stdout" }, { "output_type": "execute_result", "data": { "text/plain": [ "True" ] }, "metadata": { "tags": [] }, "execution_count": 6 } ] }, { "cell_type": "markdown", "metadata": { "id": "PGXgeOyS5qBP" }, "source": [ "# upload to the text.txt file to Google Colaboratory using the file manager " ] }, { "cell_type": "code", "metadata": { "id": "7o7EeDUUu0Sh" }, "source": [ "import math\n", "import numpy as np\n", "from nltk.tokenize import sent_tokenize, word_tokenize \n", "import gensim \n", "from gensim.models import Word2Vec \n", "import numpy as np\n", "from sklearn.metrics.pairwise import cosine_similarity\n", "import matplotlib.pyplot as plt\n", "import warnings \n", "warnings.filterwarnings(action = 'ignore') \n", "\n", "\n", "dprint=0 # prints outputs if set to 1, default=0\n", "\n", "#‘text.txt’ file \n", "sample = open(\"text.txt\", \"r\") \n", "s = sample.read() \n", "\n", "# processing escape characters \n", "f = s.replace(\"\\n\", \" \") \n", "\n", "data = [] \n", "\n", "# sentence parsing \n", "for i in sent_tokenize(f): \n", "\ttemp = [] \n", "\t# tokenize the sentence into words \n", "\tfor j in word_tokenize(i): \n", "\t\ttemp.append(j.lower()) \n", "\tdata.append(temp) \n", "\n", "# Creating Skip Gram model \n", "#model2 = gensim.models.Word2Vec(data, min_count = 1, size = 512,window = 5, sg = 1) \n", "#model = Word2Vec(sentences=common_texts, vector_size=100, window=5, min_count=1, workers=4)\n", "model2 = gensim.models.Word2Vec(data, min_count = 1, size = 512,window = 5, sg = 1)\n", "\n", "# 1-The 2-black 3-cat 4-sat 5-on 6-the 7-couch 8-and 9-the 10-brown 11-dog 12-slept 13-on 14-the 15-rug.\n", "word1='black'\n", "word2='brown'\n", "pos1=2\n", "pos2=10\n", "a=model2[word1]\n", "b=model2[word2]\n", "\n", "if(dprint==1):\n", " print(a)\n", "\n", "# compute cosine similarity\n", "dot = np.dot(a, b)\n", "norma = np.linalg.norm(a)\n", "normb = np.linalg.norm(b)\n", "cos = dot / (norma * normb)\n", "\n", "aa = 
a.reshape(1,512) \n", "ba = b.reshape(1,512)\n", "cos_lib = cosine_similarity(aa, ba)" ], "execution_count": 7, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "xlTeXmatz7bP" }, "source": [ "A Positional Encoding example using one line of basic Python using a few lines of code for the sine and cosine functions.\n", "I added a Pytorch method inspired by Pytorch.org to explore these methods.\n", "The main idea to keep in mind is that we are looking to add small values to the word embedding output so that the positions are taken into account. This means that as long as the cosine similarity, for example, displayed at the end of the notebook, shows the positions have been taken into account, the method can apply. Depending on the Transformer model, this method can be fine-tuned as well as using other methods. " ] }, { "cell_type": "code", "metadata": { "id": "EmBUq9MzxQxz" }, "source": [ "pe1=aa.copy()\n", "pe2=aa.copy()\n", "pe3=aa.copy()\n", "paa=aa.copy()\n", "pba=ba.copy()\n", "d_model=512\n", "max_print=d_model\n", "max_length=20\n", "\n", "for i in range(0, max_print,2):\n", " pe1[0][i] = math.sin(pos1 / (10000 ** ((2 * i)/d_model)))\n", " paa[0][i] = (paa[0][i]*math.sqrt(d_model))+ pe1[0][i]\n", " pe1[0][i+1] = math.cos(pos1 / (10000 ** ((2 * i)/d_model)))\n", " paa[0][i+1] = (paa[0][i+1]*math.sqrt(d_model))+pe1[0][i+1]\n", " if dprint==1:\n", " print(i,pe1[0][i],i+1,pe1[0][i+1])\n", " print(i,paa[0][i],i+1,paa[0][i+1])\n", " print(\"\\n\")\n", "\n", "#print(pe1)\n", "# A method in Pytorch using torch.exp and math.log :\n", "max_len=max_length \n", "pe = torch.zeros(max_len, d_model)\n", "position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)\n", "div_term = torch.exp(torch.arange(0, d_model, 2).float() * (-math.log(10000.0) / d_model))\n", "pe[:, 0::2] = torch.sin(position * div_term)\n", "pe[:, 1::2] = torch.cos(position * div_term)\n", "#print(pe[:, 0::2])" ], "execution_count": 8, "outputs": [] }, { "cell_type": "code", 
"metadata": { "id": "pgrXed2FwHDC", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "54dcdded-8470-47fc-999e-67294ee67dd2" }, "source": [ "\n", "for i in range(0, max_print,2):\n", " pe2[0][i] = math.sin(pos2 / (10000 ** ((2 * i)/d_model)))\n", " pba[0][i] = (pba[0][i]*math.sqrt(d_model))+ pe2[0][i]\n", " \n", " pe2[0][i+1] = math.cos(pos2 / (10000 ** ((2 * i)/d_model)))\n", " pba[0][i+1] = (pba[0][i+1]*math.sqrt(d_model))+ pe2[0][i+1]\n", " \n", " if dprint==1:\n", " print(i,pe2[0][i],i+1,pe2[0][i+1])\n", " print(i,paa[0][i],i+1,paa[0][i+1])\n", " print(\"\\n\")\n", "\n", "print(word1,word2)\n", "cos_lib = cosine_similarity(aa, ba)\n", "print(cos_lib,\"word similarity\")\n", "cos_lib = cosine_similarity(pe1, pe2)\n", "print(cos_lib,\"positional similarity\")\n", "cos_lib = cosine_similarity(paa, pba)\n", "print(cos_lib,\"positional encoding similarity\")\n", "\n", "if dprint==1:\n", " print(word1)\n", " print(\"embedding\")\n", " print(aa)\n", " print(\"positional encoding\")\n", " print(pe1)\n", " print(\"encoded embedding\")\n", " print(paa)\n", "\n", " print(word2)\n", " print(\"embedding\")\n", " print(ba)\n", " print(\"positional encoding\")\n", " print(pe2)\n", " print(\"encoded embedding\")\n", " print(pba)\n", "\n" ], "execution_count": 9, "outputs": [ { "output_type": "stream", "text": [ "black brown\n", "[[0.9998703]] word similarity\n", "[[0.8600013]] positional similarity\n", "[[0.96135795]] positional encoding similarity\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter01/text.txt ================================================ The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug.The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. 
The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. The black cat sat on the couch and the brown dog slept on the rug.The cat did not cross the street because it was too wet.The dog sat on the couch near the rug. ================================================ FILE: Chapter02/BERT_Fine_Tuning_Sentence_Classification_DR.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "BERT_Fine_Tuning_Sentence_Classification_DR.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "accelerator": "GPU" }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "jNKaJz5j_ylj" }, "source": [ "# BERT Fine-Tuning Sentence Classification\n", "Copyright 2020 Denis Rothman. The text cells were taken out and replaced by a title within each cell. The titles of the cells refer to the titles of the sections of the book. 
The descriptions of the cells have been rewritten for educational purposes.\n", "\n", "Contributor: George Mihaila\n", "\n", "[Reference Notebook by Chris McCormick and Nick Ryan](https://colab.research.google.com/drive/1pTuQhug6Dhl9XalKB0zUGf4FIdYFlpcX)\n", "\n", "[Reference Article by Chris McCormick and Nick Ryan](https://mccormickml.com/2019/07/22/BERT-fine-tuning/)" ] }, { "cell_type": "code", "metadata": { "id": "DEfSbAA4QHas", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "d8560cba-271c-451a-cc53-a088af8ce80e" }, "source": [ "#@title Activating the GPU\n", "# Main menu -> Runtime -> Change Runtime Type\n", "import tensorflow as tf\n", "device_name = tf.test.gpu_device_name()\n", "if device_name != '/device:GPU:0':\n", " raise SystemError('GPU device not found')\n", "print('Found GPU at: {}'.format(device_name))" ], "execution_count": 26, "outputs": [ { "output_type": "stream", "text": [ "Found GPU at: /device:GPU:0\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "0NmMdkZO8R6q" }, "source": [ "#@title Installing the Hugging Face PyTorch Interface for BERT\n", "# !pip install pytorch-pretrained-bert pytorch-nlp\n", "!pip install -q transformers" ], "execution_count": 27, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "Ok002ceNB8E7" }, "source": [ "#@title Importing the modules\n", "import torch\n", "from torch.utils.data import TensorDataset, DataLoader, RandomSampler, SequentialSampler\n", "from keras.preprocessing.sequence import pad_sequences\n", "from sklearn.model_selection import train_test_split\n", "from transformers import BertTokenizer, BertConfig\n", "from transformers import AdamW, BertForSequenceClassification, get_linear_schedule_with_warmup\n", "from tqdm import tqdm, trange\n", "import pandas as pd\n", "import io\n", "import numpy as np\n", "import matplotlib.pyplot as plt\n", "%matplotlib inline" ], "execution_count": 28, "outputs": [] }, { "cell_type": "code", "metadata": { "id": 
"oYsV4H8fCpZ-", "colab": { "base_uri": "https://localhost:8080/", "height": 35 }, "outputId": "5a2b9ada-305c-4c38-f5c1-9daf64758ff9" }, "source": [ "#@title Specifying CUDA as the device for Torch\n", "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n", "n_gpu = torch.cuda.device_count()\n", "torch.cuda.get_device_name(0)" ], "execution_count": 29, "outputs": [ { "output_type": "execute_result", "data": { "application/vnd.google.colaboratory.intrinsic+json": { "type": "string" }, "text/plain": [ "'Tesla P4'" ] }, "metadata": { "tags": [] }, "execution_count": 29 } ] }, { "cell_type": "markdown", "metadata": { "id": "JpfK9OOJy1OY" }, "source": [ "@article{warstadt2018neural,\n", " title={Neural Network Acceptability Judgments},\n", " author={Warstadt, Alex and Singh, Amanpreet and Bowman, Samuel R},\n", " journal={arXiv preprint arXiv:1805.12471},\n", " year={2018}\n", "}\n" ] }, { "cell_type": "code", "metadata": { "id": "_UkeC7SG2krJ", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "1d576ac9-fb08-4715-bb7b-dd0b55983bac" }, "source": [ "#@title Loading the Dataset\n", "#source of dataset : https://nyu-mll.github.io/CoLA/\n", "df = pd.read_csv(\"in_domain_train.tsv\", delimiter='\\t', header=None, names=['sentence_source', 'label', 'label_notes', 'sentence'])\n", "df.shape" ], "execution_count": 30, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "(8551, 4)" ] }, "metadata": { "tags": [] }, "execution_count": 30 } ] }, { "cell_type": "code", "metadata": { "id": "AQfTaYDo42zu", "colab": { "base_uri": "https://localhost:8080/", "height": 376 }, "outputId": "abf38bbc-0bc2-42e8-9709-b25614e6f3d3" }, "source": [ "df.sample(10)" ], "execution_count": 31, "outputs": [ { "output_type": "execute_result", "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
sentence_sourcelabellabel_notessentence
1171r-671NaNjohn is as tall as that man .
5605c_131NaNi expect soon to see the results .
6605g_811NaNjohn hummed , and mary sang , at equal volumes .
3537ks081NaNthey can smile .
8483ad031NaNalison ran
4709ks081NaNthe news was dealt with carefully .
7690sks131NaNi sent money to mary .
7515sks130*mary wrote a letter to himself last year .
2443l-931NaNa flowering plant is on the windowsill .
5680c_131NaNthe canadian bought himself a barbecue .
\n", "
" ], "text/plain": [ " sentence_source ... sentence\n", "1171 r-67 ... john is as tall as that man .\n", "5605 c_13 ... i expect soon to see the results .\n", "6605 g_81 ... john hummed , and mary sang , at equal volumes .\n", "3537 ks08 ... they can smile .\n", "8483 ad03 ... alison ran\n", "4709 ks08 ... the news was dealt with carefully .\n", "7690 sks13 ... i sent money to mary .\n", "7515 sks13 ... mary wrote a letter to himself last year .\n", "2443 l-93 ... a flowering plant is on the windowsill .\n", "5680 c_13 ... the canadian bought himself a barbecue .\n", "\n", "[10 rows x 4 columns]" ] }, "metadata": { "tags": [] }, "execution_count": 31 } ] }, { "cell_type": "code", "metadata": { "id": "GuE5BqICAne2" }, "source": [ "#@ Creating sentence, label lists and adding Bert tokens\n", "sentences = df.sentence.values\n", "\n", "# Adding CLS and SEP tokens at the beginning and end of each sentence for BERT\n", "sentences = [\"[CLS] \" + sentence + \" [SEP]\" for sentence in sentences]\n", "labels = df.label.values" ], "execution_count": 32, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "Z474sSC6oe7A", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "59ff8f5c-64ab-46a5-b74b-6fb24ded5cb8" }, "source": [ "#@title Activating the BERT Tokenizer\n", "tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', do_lower_case=True)\n", "tokenized_texts = [tokenizer.tokenize(sent) for sent in sentences]\n", "print (\"Tokenize the first sentence:\")\n", "print (tokenized_texts[0])" ], "execution_count": 33, "outputs": [ { "output_type": "stream", "text": [ "Tokenize the first sentence:\n", "['[CLS]', 'our', 'friends', 'wo', 'n', \"'\", 't', 'buy', 'this', 'analysis', ',', 'let', 'alone', 'the', 'next', 'one', 'we', 'propose', '.', '[SEP]']\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "Cp9BPRd1tMIo" }, "source": [ "#@title Processing the data\n", "# Set the maximum sequence length. 
The longest sequence in our training set is 47, but we'll leave room on the end anyway. \n", "# In the original paper, the authors used a length of 512.\n", "MAX_LEN = 128\n", "\n", "# Use the BERT tokenizer to convert the tokens to their index numbers in the BERT vocabulary\n", "input_ids = [tokenizer.convert_tokens_to_ids(x) for x in tokenized_texts]\n", "\n", "# Pad our input tokens\n", "input_ids = pad_sequences(input_ids, maxlen=MAX_LEN, dtype=\"long\", truncating=\"post\", padding=\"post\")" ], "execution_count": 34, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "cDoC24LeEv3N" }, "source": [ "#@title Create attention masks\n", "attention_masks = []\n", "\n", "# Create a mask of 1s for each token followed by 0s for padding\n", "for seq in input_ids:\n", " seq_mask = [float(i>0) for i in seq]\n", " attention_masks.append(seq_mask)" ], "execution_count": 35, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "aFbE-UHvsb7-" }, "source": [ "#@title Splitting data into train and validation sets\n", "# Use train_test_split to split our data into train and validation sets for training\n", "\n", "train_inputs, validation_inputs, train_labels, validation_labels = train_test_split(input_ids, labels, \n", " random_state=2018, test_size=0.1)\n", "train_masks, validation_masks, _, _ = train_test_split(attention_masks, input_ids,\n", " random_state=2018, test_size=0.1)" ], "execution_count": 36, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "jw5K2A5Ko1RF" }, "source": [ "#@title Converting all the data into torch tensors\n", "# Torch tensors are the required datatype for our model\n", "\n", "train_inputs = torch.tensor(train_inputs)\n", "validation_inputs = torch.tensor(validation_inputs)\n", "train_labels = torch.tensor(train_labels)\n", "validation_labels = torch.tensor(validation_labels)\n", "train_masks = torch.tensor(train_masks)\n", "validation_masks = torch.tensor(validation_masks)" ], "execution_count": 37, "outputs": [] }, { 
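The padding and attention-mask cells above can be mirrored in plain Python without Keras' `pad_sequences`. This is a sketch under the notebook's conventions (pad id 0, post-padding and post-truncation); `pad_post`, `attention_mask`, and the token ids are illustrative, not from the repository:

```python
MAX_LEN = 10  # the notebook uses 128; a small value keeps the example readable

def pad_post(ids, max_len=MAX_LEN, pad_id=0):
    """Truncate after max_len tokens, then pad the tail with pad_id."""
    ids = ids[:max_len]
    return ids + [pad_id] * (max_len - len(ids))

def attention_mask(padded):
    """1.0 for real tokens, 0.0 for padding -- same rule as the notebook's float(i > 0)."""
    return [float(tok_id > 0) for tok_id in padded]

ids = [101, 2023, 2003, 102]  # illustrative ids standing in for [CLS] ... [SEP]
padded = pad_post(ids)
print(padded)                  # [101, 2023, 2003, 102, 0, 0, 0, 0, 0, 0]
print(attention_mask(padded))  # four 1.0s followed by six 0.0s
```

The mask lets self-attention ignore the zero padding, so sentences of unequal length can share one fixed-size input tensor per batch.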
"cell_type": "code", "metadata": { "id": "GEgLpFVlo1Z-" }, "source": [ "#@title Selecting a Batch Size and Creating and Iterator\n", "# Select a batch size for training. For fine-tuning BERT on a specific task, the authors recommend a batch size of 16 or 32\n", "batch_size = 32\n", "\n", "# Create an iterator of our data with torch DataLoader. This helps save on memory during training because, unlike a for loop, \n", "# with an iterator the entire dataset does not need to be loaded into memory\n", "\n", "train_data = TensorDataset(train_inputs, train_masks, train_labels)\n", "train_sampler = RandomSampler(train_data)\n", "train_dataloader = DataLoader(train_data, sampler=train_sampler, batch_size=batch_size)\n", "\n", "validation_data = TensorDataset(validation_inputs, validation_masks, validation_labels)\n", "validation_sampler = SequentialSampler(validation_data)\n", "validation_dataloader = DataLoader(validation_data, sampler=validation_sampler, batch_size=batch_size)\n" ], "execution_count": 38, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "JzX6dkOHCv9F", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "7729a80b-bc93-41be-f999-063c77f5e2a4" }, "source": [ "#@title Bert Configuration\n", "# Initializing a BERT bert-base-uncased style configuration\n", "#@title Transformer Installation\n", "try:\n", " import transformers\n", "except:\n", " print(\"Installing transformers\")\n", " !pip -qq install transformers\n", " \n", "from transformers import BertModel, BertConfig\n", "configuration = BertConfig()\n", "\n", "# Initializing a model from the bert-base-uncased style configuration\n", "model = BertModel(configuration)\n", "\n", "# Accessing the model configuration\n", "configuration = model.config\n", "print(configuration)" ], "execution_count": 39, "outputs": [ { "output_type": "stream", "text": [ "BertConfig {\n", " \"attention_probs_dropout_prob\": 0.1,\n", " \"gradient_checkpointing\": false,\n", " \"hidden_act\": \"gelu\",\n", " 
\"hidden_dropout_prob\": 0.1,\n", " \"hidden_size\": 768,\n", " \"initializer_range\": 0.02,\n", " \"intermediate_size\": 3072,\n", " \"layer_norm_eps\": 1e-12,\n", " \"max_position_embeddings\": 512,\n", " \"model_type\": \"bert\",\n", " \"num_attention_heads\": 12,\n", " \"num_hidden_layers\": 12,\n", " \"pad_token_id\": 0,\n", " \"type_vocab_size\": 2,\n", " \"vocab_size\": 30522\n", "}\n", "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "0z3-ZV0k2qk8", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "852674d6-31a6-40e5-eed1-5548bdb03584" }, "source": [ "#@title Loading the Hugging Face Bert Uncased Base Model \n", "model = BertForSequenceClassification.from_pretrained(\"bert-base-uncased\", num_labels=2)\n", "model.cuda()" ], "execution_count": 40, "outputs": [ { "output_type": "stream", "text": [ "Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertForSequenceClassification: ['cls.predictions.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.weight', 'cls.seq_relationship.weight', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias']\n", "- This IS expected if you are initializing BertForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. 
initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", "- This IS NOT expected if you are initializing BertForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", "Some weights of BertForSequenceClassification were not initialized from the model checkpoint at bert-base-uncased and are newly initialized: ['classifier.weight', 'classifier.bias']\n", "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" ], "name": "stderr" }, { "output_type": "execute_result", "data": { "text/plain": [ "BertForSequenceClassification(\n", " (bert): BertModel(\n", " (embeddings): BertEmbeddings(\n", " (word_embeddings): Embedding(30522, 768, padding_idx=0)\n", " (position_embeddings): Embedding(512, 768)\n", " (token_type_embeddings): Embedding(2, 768)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (encoder): BertEncoder(\n", " (layer): ModuleList(\n", " (0): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), 
eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (1): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (2): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (3): BertLayer(\n", " 
(attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (4): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (5): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " 
(key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (6): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (7): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " 
(dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (8): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (9): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, 
out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (10): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " (11): BertLayer(\n", " (attention): BertAttention(\n", " (self): BertSelfAttention(\n", " (query): Linear(in_features=768, out_features=768, bias=True)\n", " (key): Linear(in_features=768, out_features=768, bias=True)\n", " (value): Linear(in_features=768, out_features=768, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (output): BertSelfOutput(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, 
inplace=False)\n", " )\n", " )\n", " (intermediate): BertIntermediate(\n", " (dense): Linear(in_features=768, out_features=3072, bias=True)\n", " )\n", " (output): BertOutput(\n", " (dense): Linear(in_features=3072, out_features=768, bias=True)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " (pooler): BertPooler(\n", " (dense): Linear(in_features=768, out_features=768, bias=True)\n", " (activation): Tanh()\n", " )\n", " )\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (classifier): Linear(in_features=768, out_features=2, bias=True)\n", ")" ] }, "metadata": { "tags": [] }, "execution_count": 40 } ] }, { "cell_type": "code", "metadata": { "id": "cJO7qtU_SsDy" }, "source": [ "##@title Optimizer Grouped Parameters\n", "#This code is taken from:\n", "# https://github.com/huggingface/transformers/blob/5bfcd0485ece086ebcbed2d008813037968a9e58/examples/run_glue.py#L102\n", "\n", "# Don't apply weight decay to any parameters whose names include these tokens.\n", "# (Here, the BERT doesn't have `gamma` or `beta` parameters, only `bias` terms)\n", "param_optimizer = list(model.named_parameters())\n", "no_decay = ['bias', 'LayerNorm.weight']\n", "# Separate the `weight` parameters from the `bias` parameters. \n", "# - For the `weight` parameters, this specifies a 'weight_decay_rate' of 0.01. \n", "# - For the `bias` parameters, the 'weight_decay_rate' is 0.0. 
\n", "optimizer_grouped_parameters = [\n", " # Filter for all parameters which *don't* include 'bias', 'gamma', 'beta'.\n", " {'params': [p for n, p in param_optimizer if not any(nd in n for nd in no_decay)],\n", " 'weight_decay_rate': 0.1},\n", " \n", " # Filter for parameters which *do* include those.\n", " {'params': [p for n, p in param_optimizer if any(nd in n for nd in no_decay)],\n", " 'weight_decay_rate': 0.0}\n", "]\n", "# Note - `optimizer_grouped_parameters` only includes the parameter values, not \n", "# the names." ], "execution_count": 41, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "GLs72DuMODJO" }, "source": [ "#@title The Hyperparemeters for the Training Loop \n", "# optimizer = BertAdam(optimizer_grouped_parameters,\n", "# lr=2e-5,\n", "# warmup=.1)\n", "\n", "# Number of training epochs (authors recommend between 2 and 4)\n", "epochs = 4\n", "\n", "optimizer = AdamW(optimizer_grouped_parameters,\n", " lr = 2e-5, # args.learning_rate - default is 5e-5, our notebook had 2e-5\n", " eps = 1e-8 # args.adam_epsilon - default is 1e-8.\n", " )\n", "# Total number of training steps is number of batches * number of epochs.\n", "# `train_dataloader` contains batched data so `len(train_dataloader)` gives \n", "# us the number of batches.\n", "total_steps = len(train_dataloader) * epochs\n", "\n", "# Create the learning rate scheduler.\n", "scheduler = get_linear_schedule_with_warmup(optimizer, \n", " num_warmup_steps = 0, # Default value in run_glue.py\n", " num_training_steps = total_steps)" ], "execution_count": 42, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "9cQNvaZ9bnyy" }, "source": [ "#Creating the Accuracy Measurement Function\n", "# Function to calculate the accuracy of our predictions vs labels\n", "def flat_accuracy(preds, labels):\n", " pred_flat = np.argmax(preds, axis=1).flatten()\n", " labels_flat = labels.flatten()\n", " return np.sum(pred_flat == labels_flat) / len(labels_flat)" ], "execution_count": 43, 
"outputs": [] }, { "cell_type": "code", "metadata": { "id": "6J-FYdx6nFE_", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "791a450e-3e14-435f-bdd8-77f54e520e99" }, "source": [ "#@title The Training Loop\n", "t = [] \n", "\n", "# Store our loss and accuracy for plotting\n", "train_loss_set = []\n", "\n", "# trange is a tqdm wrapper around the normal python range\n", "for _ in trange(epochs, desc=\"Epoch\"):\n", " \n", " \n", " # Training\n", " \n", " # Set our model to training mode (as opposed to evaluation mode)\n", " model.train()\n", " \n", " # Tracking variables\n", " tr_loss = 0\n", " nb_tr_examples, nb_tr_steps = 0, 0\n", " \n", " # Train the data for one epoch\n", " for step, batch in enumerate(train_dataloader):\n", " # Add batch to GPU\n", " batch = tuple(t.to(device) for t in batch)\n", " # Unpack the inputs from our dataloader\n", " b_input_ids, b_input_mask, b_labels = batch\n", " # Clear out the gradients (by default they accumulate)\n", " optimizer.zero_grad()\n", " # Forward pass\n", " outputs = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask, labels=b_labels)\n", " loss = outputs['loss']\n", " train_loss_set.append(loss.item()) \n", " # Backward pass\n", " loss.backward()\n", " # Update parameters and take a step using the computed gradient\n", " optimizer.step()\n", "\n", " # Update the learning rate.\n", " scheduler.step()\n", " \n", " \n", " # Update tracking variables\n", " tr_loss += loss.item()\n", " nb_tr_examples += b_input_ids.size(0)\n", " nb_tr_steps += 1\n", "\n", " print(\"Train loss: {}\".format(tr_loss/nb_tr_steps))\n", " \n", " \n", " # Validation\n", "\n", " # Put model in evaluation mode to evaluate loss on the validation set\n", " model.eval()\n", "\n", " # Tracking variables \n", " eval_loss, eval_accuracy = 0, 0\n", " nb_eval_steps, nb_eval_examples = 0, 0\n", "\n", " # Evaluate data for one epoch\n", " for batch in validation_dataloader:\n", " # Add batch to GPU\n", " batch = 
tuple(t.to(device) for t in batch)\n", " # Unpack the inputs from our dataloader\n", " b_input_ids, b_input_mask, b_labels = batch\n", " # Telling the model not to compute or store gradients, saving memory and speeding up validation\n", " with torch.no_grad():\n", " # Forward pass, calculate logit predictions\n", " logits = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask)\n", " \n", " # Move logits and labels to CPU\n", " logits = logits['logits'].detach().cpu().numpy()\n", " label_ids = b_labels.to('cpu').numpy()\n", "\n", " tmp_eval_accuracy = flat_accuracy(logits, label_ids)\n", " \n", " eval_accuracy += tmp_eval_accuracy\n", " nb_eval_steps += 1\n", "\n", " print(\"Validation Accuracy: {}\".format(eval_accuracy/nb_eval_steps))" ], "execution_count": 44, "outputs": [ { "output_type": "stream", "text": [ "\rEpoch: 0%| | 0/4 [00:00" ] }, "metadata": { "tags": [], "needs_background": "light" } } ] }, { "cell_type": "code", "metadata": { "id": "mAN0LZBOOPVh" }, "source": [ "#@title Predicting and Evaluating Using the Hold-out Dataset \n", "df = pd.read_csv(\"out_of_domain_dev.tsv\", delimiter='\\t', header=None, names=['sentence_source', 'label', 'label_notes', 'sentence'])\n", "\n", "# Create sentence and label lists\n", "sentences = df.sentence.values\n", "\n", "# We need to add special tokens at the beginning and end of each sentence for BERT to work properly\n", "sentences = [\"[CLS] \" + sentence + \" [SEP]\" for sentence in sentences]\n", "labels = df.label.values\n", "\n", "tokenized_texts = [tokenizer.tokenize(sent) for sent in sentences]\n", "\n", "\n", "MAX_LEN = 128\n", "\n", "# Use the BERT tokenizer to convert the tokens to their index numbers in the BERT vocabulary\n", "input_ids = [tokenizer.convert_tokens_to_ids(x) for x in tokenized_texts]\n", "# Pad our input tokens\n", "input_ids = pad_sequences(input_ids, maxlen=MAX_LEN, dtype=\"long\", truncating=\"post\", padding=\"post\")\n", "# Create attention masks\n", "attention_masks = 
[]\n", "\n", "# Create a mask of 1s for each token followed by 0s for padding\n", "for seq in input_ids:\n", " seq_mask = [float(i>0) for i in seq]\n", " attention_masks.append(seq_mask) \n", "\n", "prediction_inputs = torch.tensor(input_ids)\n", "prediction_masks = torch.tensor(attention_masks)\n", "prediction_labels = torch.tensor(labels)\n", " \n", "batch_size = 32 \n", "\n", "\n", "prediction_data = TensorDataset(prediction_inputs, prediction_masks, prediction_labels)\n", "prediction_sampler = SequentialSampler(prediction_data)\n", "prediction_dataloader = DataLoader(prediction_data, sampler=prediction_sampler, batch_size=batch_size)" ], "execution_count": 46, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "Hba10sXR7Xi6" }, "source": [ "# Prediction on test set\n", "\n", "# Put model in evaluation mode\n", "model.eval()\n", "\n", "# Tracking variables \n", "predictions , true_labels = [], []\n", "\n", "# Predict \n", "for batch in prediction_dataloader:\n", " # Add batch to GPU\n", " batch = tuple(t.to(device) for t in batch)\n", " # Unpack the inputs from our dataloader\n", " b_input_ids, b_input_mask, b_labels = batch\n", " # Telling the model not to compute or store gradients, saving memory and speeding up prediction\n", " with torch.no_grad():\n", " # Forward pass, calculate logit predictions\n", " logits = model(b_input_ids, token_type_ids=None, attention_mask=b_input_mask)\n", "\n", " # Move logits and labels to CPU\n", " logits = logits['logits'].detach().cpu().numpy()\n", " label_ids = b_labels.to('cpu').numpy()\n", " \n", " # Store predictions and true labels\n", " predictions.append(logits)\n", " true_labels.append(label_ids)" ], "execution_count": 47, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "cRaZQ4XC7kLs", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "819f0e2f-168e-49ee-8d45-36cbc8452711" }, "source": [ "#@title Evaluating Using Matthew's Correlation Coefficient\n", "# Import and evaluate each test 
batch using Matthew's correlation coefficient\n", "from sklearn.metrics import matthews_corrcoef\n", "matthews_set = []\n", "\n", "for i in range(len(true_labels)):\n", " matthews = matthews_corrcoef(true_labels[i],\n", " np.argmax(predictions[i], axis=1).flatten())\n", " matthews_set.append(matthews)" ], "execution_count": 48, "outputs": [ { "output_type": "stream", "text": [ "/usr/local/lib/python3.6/dist-packages/sklearn/metrics/_classification.py:900: RuntimeWarning: invalid value encountered in double_scalars\n", " mcc = cov_ytyp / np.sqrt(cov_ytyt * cov_ypyp)\n" ], "name": "stderr" } ] }, { "cell_type": "markdown", "metadata": { "id": "IUM0UA1qJaVB" }, "source": [ "The final score will be based on the entire test set, but let's take a look at the scores on the individual batches to get a sense of the variability in the metric between batches.\n" ] }, { "cell_type": "code", "metadata": { "id": "xytAr_C48wnu", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "07097f15-0ae7-41af-f114-e5d9002dced9" }, "source": [ "#@title Score of Individual Batches\n", "matthews_set" ], "execution_count": 49, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "[0.049286405809014416,\n", " -0.17407765595569785,\n", " 0.4732058754737091,\n", " 0.34151450937027694,\n", " 0.5945883900105632,\n", " 0.7410010097502685,\n", " 0.4472135954999579,\n", " 0.29277002188455997,\n", " 0.9165151389911681,\n", " 0.8246211251235321,\n", " 0.8459051693633014,\n", " 0.7419408268023742,\n", " 0.6979824404521128,\n", " 0.7141684885491869,\n", " 0.2773500981126145,\n", " 0.5056936741642399,\n", " 0.0]" ] }, "metadata": { "tags": [] }, "execution_count": 49 } ] }, { "cell_type": "code", "metadata": { "id": "oCYZa1lQ8Jn8", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "c604d08f-efeb-48bc-b5eb-bf3b0811bf6b" }, "source": [ "#@title Matthew's Evaluation on the Whole Dataset\n", "# Flatten the predictions and true values for aggregate Matthew's 
evaluation on the whole dataset\n", "flat_predictions = [item for sublist in predictions for item in sublist]\n", "flat_predictions = np.argmax(flat_predictions, axis=1).flatten()\n", "flat_true_labels = [item for sublist in true_labels for item in sublist]\n", "matthews_corrcoef(flat_true_labels, flat_predictions)" ], "execution_count": 50, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "0.5453476037943634" ] }, "metadata": { "tags": [] }, "execution_count": 50 } ] } ] } ================================================ FILE: Chapter02/in_domain_train.tsv ================================================ gj04 1 our friends wo n't buy this analysis , let alone the next one we propose . gj04 1 one more pseudo generalization and i 'm giving up . gj04 1 one more pseudo generalization or i 'm giving up . gj04 1 the more we study verbs , the crazier they get . gj04 1 day by day the facts are getting murkier . gj04 1 i 'll fix you a drink . gj04 1 fred watered the plants flat . gj04 1 bill coughed his way out of the restaurant . gj04 1 we 're dancing the night away . gj04 1 herman hammered the metal flat . gj04 1 the critics laughed the play off the stage . gj04 1 the pond froze solid . gj04 1 bill rolled out of the room . gj04 1 the gardener watered the flowers flat . gj04 1 the gardener watered the flowers . gj04 1 bill broke the bathtub into pieces . gj04 1 bill broke the bathtub . gj04 1 they drank the pub dry . gj04 0 * they drank the pub . gj04 1 the professor talked us into a stupor . gj04 0 * the professor talked us . gj04 1 we yelled ourselves hoarse . gj04 0 * we yelled ourselves . gj04 0 * we yelled harry hoarse . gj04 1 harry coughed himself into a fit . gj04 0 * harry coughed himself . gj04 0 * harry coughed us into a fit . gj04 1 bill followed the road into the forest . gj04 1 we drove highway 5 from sd to sf . gj04 1 fred tracked the leak to its source . gj04 1 john danced waltzes across the room . 
gj04 1 bill urinated out the window . gj04 1 bill coughed out the window . gj04 1 bill bled on the floor . gj04 1 the toilet leaked through the floor into the kitchen below . gj04 1 bill ate off the floor . gj04 1 bill drank from the hose . gj04 1 this metal hammers flat easily . gj04 1 they made him president . gj04 1 they made him angry . gj04 0 * they caused him to become angry by making him . gj04 0 * they caused him to become president by making him . gj04 0 * they made him to exhaustion . gj04 1 they made him into a monster . gj04 1 the trolley rumbled through the tunnel . gj04 1 the wagon rumbled down the road . gj04 1 the bullets whistled past the house . gj04 1 the knee replacement candidate groaned up the stairs . gj04 0 * the car honked down the road . gj04 0 * the dog barked out of the room . gj04 1 the dog barked its way out of the room . gj04 1 bill whistled his way past the house . gj04 1 the witch vanished into the forest . gj04 1 bill disappeared down the road . gj04 0 * the witch went into the forest by vanishing . gj04 1 the witch went into the forest and thereby vanished . gj04 1 the building is tall and wide . gj04 0 * the building is tall and tall . gj04 1 this building is taller and wider than that one . gj04 1 this building got taller and wider than that one . gj04 1 this building got taller and taller . gj04 0 * this building is taller and taller . gj04 0 * this building got than that one . gj04 0 * this building is than that one . gj04 1 bill floated into the cave . gj04 0 *? bill floated into the cave for hours . gj04 0 *? bill pushed harry off the sofa for hours . gj04 1 bill floated down the river for hours . gj04 1 bill floated down the river . gj04 1 bill pushed harry along the trail for hours . gj04 1 bill pushed harry along the trail . gj04 1 the road zigzagged down the hill . gj04 1 the rope stretched over the pulley . gj04 1 the weights stretched the rope over the pulley . 
gj04 1 the weights kept the rope stretched over the pulley . gj04 1 sam cut himself free . gj04 1 sam got free by cutting his finger . gj04 1 bill cried himself to sleep . gj04 0 * bill cried sue to sleep . gj04 1 bill squeezed himself through the hole . gj04 1 bill sang himself to sleep . gj04 1 bill squeezed the puppet through the hole . gj04 1 bill sang sue to sleep . gj04 0 * the elevator rumbled itself to the ground . gj04 1 if the telephone rang , it could ring itself silly . gj04 0 * she yelled hoarse . gj04 0 * ted cried to sleep . gj04 1 the tiger bled to death . gj04 1 he coughed awake and we were all overjoyed , especially sierra . gj04 1 john coughed awake , rubbing his nose and cursing under his breath . gj04 1 john coughed himself awake on the bank of the lake where he and bill had their play . gj04 1 ron yawned himself awake . gj04 1 she coughed herself awake as the leaf landed on her nose . gj04 1 the worm wriggled onto the carpet . gj04 1 the chocolate melted onto the carpet . gj04 0 * the ball wriggled itself loose . gj04 1 bill wriggled himself loose . gj04 1 aliza wriggled her tooth loose . gj04 1 the off center spinning flywheel shook itself loose . cj99 1 the more you eat , the less you want . cj99 1 if you eat more , you want correspondingly less . cj99 1 when you eat more , you want correspondingly less . cj99 1 as you eat more , you want correspondingly less . cj99 0 * the most you want , the least you eat . cj99 1 the angrier sue gets , the more fred admires her . cj99 1 the more that you eat , the less that you want . cj99 1 the angrier that sue gets , the more that fred admires her . cj99 1 i think that the more you eat , the less you want . cj99 1 i 'm not shocked by the idea that the more you eat , the less you want . cj99 1 it is obvious that the more you eat , the less you want . cj99 1 it is not entirely clear if the more you eat , the less you want . cj99 1 i want to explain exactly why the more you eat , the less you want . 
cj99 1 i demand that the more john eats , the more he pays . cj99 0 * i demand that the more john eat , the more he pay . cj99 1 i demand that john pay more , the more he eats . cj99 0 * i demand that john pays more , the more he eat . cj99 1 you get angrier , the more we eat , do n't you . cj99 0 * you get angrier , the more we eat , do n't we . cj99 0 * the harder it has rained , how much faster a flow that appears in the river ? cj99 1 the harder it has rained , how much faster a flow appears in the river ? cj99 0 * the harder it rains , how much faster that do you run ? cj99 1 the harder it rains , how much faster do you run ? cj99 1 the harder it rains , how much faster a flow do you see in the river ? cj99 0 * the harder it rains , how much faster a flow that do you see in the river ? cj99 1 when it rains harder , how much faster a flow appears in the river ? cj99 1 as it rains harder , how much faster a flow appears in the river ? cj99 0 * as it rains harder , how much faster a flow that appears in the river ? cj99 0 * when it rains harder , how much faster a flow that appears in the river ? cj99 1 how much harder has it rained , the faster a flow you see in the river ? cj99 1 how much harder has it rained , when you see a faster flow in the river ? cj99 0 * the more john eats , the tighter keep your mouth shut about it . cj99 0 * the more everyone eat , the more john keeps his big mouth shut about it , ok ? cj99 1 when john eats more , keep your mouth shut tighter , ok ? cj99 1 as john eats more , keep your mouth shut tighter , ok ? cj99 1 keep your mouth shut tighter , the more john eats , ok ? cj99 1 everyone keep your mouth shut tighter , the more john eats , ok ? cj99 0 ?? i can well imagine the more him eating , the fatter him getting . cj99 1 bill can well imagine getting fat . cj99 0 * bill can well imagine the more he eats , the fatter getting . cj99 1 fred can well imagine joe getting fatter , the more he eats . 
cj99 0 * it is important the more you eat , the more careful to be . cj99 0 * it is important for the more you eat , the more careful to be . cj99 0 * it is important the more you to eat , the more careful to be . cj99 0 * it is important the more you eat , the more careful you to be . cj99 0 * it is important the more you eat , the more careful for you to be . cj99 0 * it is important for the more you to eat , the more careful to be . cj99 0 * it is important for the more you to eat , the more careful for you to be . cj99 0 * it is important the more you to eat , the more careful for you to be . cj99 0 * it is important for the more you eat , the more careful you to be . cj99 1 it is important for you to be more careful , the more you eat . cj99 1 it is important to be more careful , the more you eat . cj99 0 * i can well imagine quickly mary answering the question . cj99 0 ?* i can well imagine with a hatchet mary destroying the jeep . cj99 0 ?* i can well imagine if he eats more , him getting fat . cj99 0 * it is not entirely obvious if , mary listens to the grateful dead , she gets depressed . cj99 0 * it is not entirely obvious whether , mary listens to the grateful dead , she gets depressed . cj99 1 mary listens to the grateful dead and she gets depressed . cj99 1 if mary listens to the grateful dead , she gets depressed . cj99 1 when mary listens to the grateful dead , she gets depressed . cj99 1 mary gets depressed if she listens to the grateful dead . cj99 1 mary gets depressed when she listens to the grateful dead . cj99 1 the more she looked at pictures , the angrier mary got . cj99 0 * the more pictures mary looked at , she got angrier and angrier . cj99 1 mary gets depressed and she listens to the grateful dead . cj99 1 the higher the stakes are , the lower his expectations are . cj99 1 the higher the stakes , the lower his expectations . cj99 1 his expectations are lower , the higher the stakes . 
cj99 1 his expectations are lower , the higher the stakes are . cj99 0 * his expectations lower , the higher the stakes . cj99 0 * his expectations lower , the higher the stakes are . cj99 1 the more obnoxious fred is , the less attention you should pay to him . cj99 0 * the more obnoxious fred , the less attention you should pay to him . cj99 1 the more fred is obnoxious , the less you should pay attention to him . cj99 0 * the more obnoxious fred , the less you should pay attention to him . cj99 1 his expectations are always lower than mine . cj99 1 john was lots more obnoxious than fred was . cj99 1 you should always lock your door , no matter how fancy the hotel might be . cj99 1 you should always lock your door , no matter how fancy the hotel . cj99 1 i do n't plan to lock the door , no matter how fancy this hotel is . cj99 0 * i do n't plan to lock the door , no matter how fancy this hotel . cj99 1 i 'm going out , whatever the weather . cj99 1 i 'm going out , wherever that hurricane might be . cj99 0 * i 'm going out , wherever that hurricane . cj99 1 the more examples mary says that bill has helped fred to discover the less i believe her . cj99 0 * the more food mary knows a man that eats the poorer she gets . cj99 0 * the fatter he goes to a doctor when he gets the more he eats . cj99 0 * the fatter that that he gets bothers him , the more he eats . cj99 0 * the more books i ask to whom he will give , the more he reads . cj99 0 * the more people i ask what he will give to the more he reads . cj99 1 the more carefully he words the letter the safer he 'll be . cj99 0 * the more carefully he knows a man that worded the letter the safer he 'll be . cj99 1 the more geniuses john meets , the angrier he gets . cj99 0 * the more john meets geniuses , the angrier he gets . cj99 1 the more people you say will buy tickets , the happier i 'll be . cj99 0 * the more people you say that will buy tickets , the happier i 'll be . 
cj99 1 the more people you say that right after the show opens will buy tickets , the happier i 'll be . cj99 1 the more i talk to joe , the less about linguistics i am inclined to think sally has taught him to appreciate . cj99 0 * the more he eats , the poorer he knows a woman that gets . cj99 0 * the more he eats , the fatter he goes to a doctor when he gets . cj99 0 * the more he eats , the fatter that that he gets really bothers me . cj99 0 * the more he reads , the more books i wonder to whom he will give . cj99 0 * the more he reads , the more people i wonder what he will give to . cj99 0 * the sooner you call , the more carefully i know a man that will word the letter . cj99 1 the richer john gets , the more geniuses john meets . cj99 0 * the richer he gets , the more john meets geniuses . cj99 1 the more articles he reads , the fewer people he thinks will go into linguistics . cj99 0 * the more articles he reads , the fewer people he thinks that will go into linguistics . cj99 1 the more articles he reads , the fewer people he thinks that under the current circumstances will go into linguistics . cj99 1 the more articles he reads , the fewer people he thinks under the current circumstances will go into linguistics . cj99 1 the more people that arrive , the louder that it gets . cj99 1 the more people that arrive , the louder it gets . cj99 1 the more people you give beer to , the more people that get sick . cj99 1 the more people that you give beer to , the more people that get sick . cj99 1 the more people arrive , the louder that it gets . cj99 1 the more people arrive , the louder it gets . cj99 1 the more people that you give beer to , the more people get sick . cj99 0 * the more pictures of john that he buys the more arrogant he becomes . cj99 1 the more pictures of himself that john buys the more arrogant he becomes . cj99 1 the man that arrived on the train was my brother . cj99 0 * the man arrived on the train was my brother . 
cj99 0 * the more people everyone who likes pays attention to , the happier we all are . cj99 0 * the later it gets , the more people everyone who likes pays attention to . cj99 1 whenever bill smokes , susan hates him all the more . cj99 1 whenever bill smokes , susan hates him much more . cj99 1 whenever bill smokes , susan hates him far more . cj99 1 whenever bill smokes , susan hates him a lot more . cj99 1 once janet left , fred became all the crazier . cj99 1 once janet left , fred became much crazier . cj99 1 once janet left , fred became far crazier . cj99 1 fred became all the crazier , the more often janet left . cj99 1 when bill smokes , all the more does susan hate him . cj99 0 * when bill smokes , much more does susan hate him . cj99 0 * when bill smokes , all the more susan hates him . cj99 1 so much did you eat that everyone gasped . cj99 1 so fast did you run that everyone gasped . cj99 1 so intelligent a dog did you buy that everyone gasped . cj99 1 i know how much you ate . cj99 1 i know how fast you ran . cj99 1 i know how intelligent a dog you bought . cj99 1 he ate so much that he got sick . cj99 1 so much did he eat that he got sick . cj99 1 the more you eat , the more you want . cj99 0 * you eat the more , the more you want . cj99 0 * the more you eat , you want the more . cj99 0 * i wonder you ate how much . cj99 1 i wonder to how many people bill talks . cj99 1 the longer he has to wait , the angrier john gets . cj99 1 if he has to wait , john gets angry . cj99 0 * he gets angry , the longer john has to wait . cj99 0 * he gets angry if john has to wait . cj99 1 the more that pictures of him appear in the news , the more embarrassed john becomes . cj99 1 the more pictures of himself that appear in the news , the more embarrassed john becomes . cj99 1 the more that pictures of himself appear in the news , the more embarrassed john becomes . cj99 1 the more pictures of him appear in the news , the more likely john is to get arrested . 
cj99 0 * the more pictures of himself appear in the news , the more likely john is to get arrested . cj99 1 the more that pictures of him appear in the news , the more likely john is to get arrested . cj99 0 * the more that pictures of himself appear in the news , the more likely john is to get arrested . cj99 1 the more that john gets upset by them , the more that stories about him seem to show up in the news . cj99 0 * the more that john gets upset by them , the more that stories about himself seem to show up in the news . cj99 1 john is more embarrassed , the more pictures of him appear in the news . cj99 1 john is more embarrassed , the more pictures of him that appear in the news . cj99 1 john is more embarrassed , the more pictures of himself appear in the news . cj99 1 john is more embarrassed , the more pictures of himself that appear in the news . cj99 1 stories about him seem to show up more on the evening news , the more that john gets upset by them . cj99 0 * stories about himself seem to show up more on the evening news , the more that john gets upset by them . cj99 1 if you give him enough opportunity , every senator will succumb to corruption . cj99 1 you give him enough opportunity and every senator will succumb to corruption . cj99 0 * we gave him enough opportunity and , sure enough , every senator succumbed to corruption . cj99 1 if you give any senator enough opportunity , he will succumb to corruption . cj99 1 you give any senator enough opportunity and he will succumb to corruption . cj99 0 * you give every senator enough opportunity and he will succumb to corruption . cj99 0 * we gave any senator enough opportunity and , sure enough , he succumbed to corruption . cj99 0 * we gave every senator enough opportunity and , sure enough , he succumbed to corruption . cj99 1 the more lobbyists he talks to , the more corrupt every senator seems to become . 
cj99 1 the more lobbyists wine and dine him , the more every senator is susceptible to corruption . cj99 0 * the more time that every senator spends with lobbyists , the more likely he succumbs to corruption . cj99 1 every senator becomes more corrupt , the more lobbyists he talks to . cj99 1 any senator becomes more corrupt , the more lobbyists he talks to . cj99 0 * he seems to become more corrupt , the more lobbyists any senator talks to . cj99 0 * he seems to become more corrupt , the more lobbyists every senator talks to . cj99 1 every senator seems to become more corrupt , if he talks to more lobbyists . cj99 1 any senator seems to become more corrupt , if he talks to more lobbyists . cj99 1 any senator seems to become more corrupt , as he talks to more lobbyists . cj99 0 * he seems to become more corrupt , if any senator talks to more lobbyists . cj99 0 * he seems to become more corrupt , if every senator talks to more lobbyists . cj99 0 * he seems to become more corrupt , as every senator talks to more lobbyists . cj99 0 * he seems to become more corrupt , as any senator talks to more lobbyists . cj99 1 the sooner you solve this problem , the more easily you 'll satisfy the folks up at corporate headquarters . cj99 1 this is the sort of problem which the sooner you solve the more easily you 'll satisfy the folks up at corporate headquarters . cj99 1 the folks up at corporate headquarters are the sort of people who the sooner you solve this problem , the more easily you 'll satisfy . cj99 1 this problem , the sooner you solve the more easily you 'll satisfy the folks up at corporate headquarters . cj99 1 who did you give pictures of to friends of ? cj99 1 it is this problem that the sooner you solve the more easily you 'll satisfy the folks up at corporate headquarters . cj99 0 ?* it is the folks up at corporate headquarters who the sooner you solve this problem , the more easily you 'll satisfy . 
cj99 0 * which problem the sooner you solve , will the more easily you satisfy the folks up at corporate headquarters ? cj99 0 * which problem does the sooner that you solve , the more easily you 'll satisfy the folks up at corporate headquarters ? cj99 0 * which problem the sooner that you solve , will the more easily you satisfy the folks up at corporate headquarters ? cj99 0 * the harder it rains , the faster who runs ? cj99 0 * the louder who talks , the angrier you get ? cj99 1 the harder that it rains , how much faster a flow do you see in the river ? cj99 1 they failed to tell me which problem the sooner i solve , the quicker the folks up at corporate headquarters . cj99 0 ?? i finally worked up enough courage to ask which people up at corporate headquarters the sooner i solve this problem , the quicker i 'll get free of . cj99 0 ?? which folks up at corporate headquarters do you think that the sooner you solve this problem , the quicker you 'll be able to tell t to buzz off ? cj99 0 ?? this is a problem that you 'll be able to tell the folks up at corporate headquarters to buzz off if you solve . cj99 1 this is a problem that you 'll be able to tell the folks up at corporate headquarters to buzz off if you solve it . cj99 0 ?? this is a problem that you solve it and you 'll be able to tell the folks up at corporate headquarters to buzz off . cj99 0 ?? those are the folks that you just solve this problem and you 'll be able to put them on ice . cj99 0 ?? they failed to tell me which problem i 'll beat the competition more easily , the sooner i solve . cj99 0 ?? this is the problem that you 'll beat the competition more easily , the sooner you solve . bc01 1 john saw the man in the room . bc01 1 which room did john see the man in ? bc01 1 who did john think that bill claimed that mary suspected that everybody liked ? bc01 1 john could not visit sally . bc01 1 what john could do is not visit sally . bc01 1 john could n't visit sally . 
bc01 1 why did john leave ? bc01 1 i hit the ball . bc01 1 you hit the ball . bc01 0 * he hit the ball . bc01 0 * she hit the ball . bc01 1 they hit the ball . bc01 0 * am not i going ? bc01 1 i am not going . bc01 1 are n't i going ? bc01 0 * i are n't going . bc01 1 louise is unhappy , is n't she ? bc01 1 louise likes not being happy , does n't she ? bc01 1 not many books survived the fire , did they ? bc01 1 no books survived the fire , did they ? bc01 1 he has n't often paid taxes , has he ? bc01 1 he ca n't pay taxes , can he ? bc01 1 she does not see him . bc01 1 she kept not seeing him . bc01 1 she could not have been working . bc01 0 * marianne not left . bc01 0 * marianne left not . bc01 1 he could not have been working . bc01 1 he can not have been working . bc01 1 he can simply not have been working . bc01 1 you must not simply not work . bc01 1 he may not just not have been working . bc01 1 he ca n't have been working . bc01 1 ca n't he have been working ? bc01 1 can he not have been working ? bc01 0 * can he not have been working ? bc01 1 john wrote books . bc01 0 * john write books . bc01 0 * john wrote books . bc01 1 john did not write books . bc01 0 * john seems that is nice . bc01 1 `` i am so happy '' , thought john . bc01 1 down the hill rolled john . bc01 0 * john kisses often mary . bc01 1 john often kisses mary . bc01 1 who do you think mary said john likes ? bc01 0 ?* who did you ask whether mary knows why john likes ? bc01 1 who do you think that mary said that john likes ? bc01 0 * how do you wonder whether mary solved the problem ? bc01 1 how do you think that mary solved the problem ? bc01 0 * how do you wonder whether john said that mary solved the problem ? bc01 0 * how do you wonder whether john said mary solved the problem ? bc01 0 ?? which problem do you wonder whether john said that mary solved ? bc01 1 how did you think that mary solved the problem ? bc01 1 mary hired someone . bc01 1 i heard that mary hired someone . 
bc01 1 i resigned because mary hired someone . bc01 1 mary wondered which picture of himself bill saw ? bc01 1 which picture of himself does mary think that john said that susan likes ? bc01 0 * mary thinks that john said that susan likes pictures of himself ? bc01 1 mary thinks that john said that pictures of himself , susan likes ? bc01 1 if you do n't believe me , you will the weatherman ? bc01 1 i rolled up a newspaper , and lynn did a magazine ? bc01 1 kathy likes astronomy , but she does n't meteorology ? bc01 1 the da proved jones guilty and the assistant da will prove smith . bc01 1 mary will believe susan , and you will bob . bc01 1 you might not believe me but you will bob . bc01 0 * you will bob believe . bc01 1 how did you solve the problem ? bc01 1 i wonder who could solve the problem in this way . bc01 0 * how do you wonder who could solve this problem . bc01 1 no candidate can predict how many people will vote for him . bc01 1 every politician is worried when the press starts attacking him . bc01 1 which politician appointed the journalist who supported him ? bc01 0 * the fact that no candidate was elected shows that he was inadequate . bc01 1 john sells books , mary buys records and bill v newspapers . bc01 1 the question of whether john met mary worries the people who support . bc01 1 they have left . bc01 1 have they left ? bc01 1 could they have left ? bc01 1 he has often seen mary . bc01 1 he i often sees mary . bc01 0 * he sees often mary . bc01 0 * sees he i often mary ? bc01 1 it seems that it is likely that john will win . bc01 1 it seems that john is likely to win . bc01 1 john seems to be likely to win . bc01 0 * john seems that it is likely to win . bc01 0 * john seems will win . bc01 0 * how do you wonder which problem to solve ? bc01 1 how intelligent do you consider john ? bc01 0 ?? how many people do you wonder whether i consider intelligent ? bc01 0 * how intelligent do you wonder whether i consider john ? 
bc01 0 * what the hell do you wonder how to say ? bc01 1 he has left . bc01 1 his book is nice . bc01 1 bill saw him . bc01 1 bill works with him . bc01 1 john believes him to be a nice guy . bc01 1 john considers him a nice guy . bc01 1 for him to do that would be a mistake . bc01 1 with him sick , the team is in trouble . bc01 0 * a man to be in the garden is unlikely . bc01 0 * a man to come is unlikely . bc01 0 * john to call would be unlikely . bc01 0 * this conclusion to be arrived at is surprising . bc01 1 john believes that he is sick . bc01 0 * john believes that him is sick . bc01 0 * john tries him to win . bc01 0 * john wonders where him to go . bc01 1 who do you think that bill likes ? bc01 1 who do you think that bill believes to be innocent ? bc01 0 * who do you think that believes john to be innocent ? bc01 0 * who would you prefer for to win the race ? bc01 1 someone stole my car . bc01 1 my car was stolen . bc01 0 * the children eat all chocolate . bc01 1 john has often kissed mary . bc01 1 the kids have all eaten the chocolate . bc01 1 in general , he understands what 's going on . bc01 1 it 's probable that in general he understands what 's going on . bc01 0 * it 's probable in general that he understands what 's going on . bc01 0 * in general that he understands what 's going on is surprising . bc01 1 i explained how to fix the sink . bc01 1 i explained how we should fix the sink . bc01 1 i explained that we should fix the sink . bc01 0 * i explained to fix the sink . bc01 1 mickey looked up the reference . bc01 1 mickey looked the reference up . bc01 1 mickey looked up them . bc01 1 mickey teamed up with the women . bc01 0 * mickey teamed with the women up . bc01 1 mickey pointed out that gary had left . bc01 0 * mickey pointed that gary had left out . bc01 1 mickey slips up all the time . bc01 0 * mickey slips all the time up . bc01 1 what does john think mary bought ? bc01 0 * john thinks what mary bought . 
bc01 1 john wonders what mary bought . bc01 0 * what does john wonder mary bought ? bc01 0 ?? who is he reading a book that criticizes ? bc01 0 ?? what do you remember where we bought ? bc01 1 who bought what ? bc01 1 who is reading a book that criticizes who ? bc01 1 who remembers where we bought what ? bc01 0 * i wonder who what bought ? bc01 0 * i wonder what who bought ? bc01 1 there are n't many linguistics students here . bc01 1 i have n't met many linguistics students . bc01 1 what does every student buy ? bc01 1 i need sally to be there . bc01 0 * the boat sank to collect the insurance . bc01 1 the boat was sunk to collect the insurance . bc01 1 john wants to win . bc01 1 the bed was unmade . bc01 0 * headway was unmade . bc01 1 john was unknown . bc01 0 * john was unknown to be the murderer . bc01 1 we knew john to be the murderer . bc01 1 he fed the children . bc01 1 the children were uneducated . bc01 1 the children were undisciplined . bc01 1 i believed these students all to like john . bc01 1 they tried to all like john . bc01 1 i believed these students to all like john . bc01 0 ?* did he try ever to talk to the student ? bc01 1 did you believe him ever to have made an effort to talk to the student ? bc01 1 did he try to ever be attentive to the needs of students ? bc01 1 did you believe him to ever have made an effort to talk to the student ? bc01 1 work out an analysis that is typical of this view of understood subjects . bc01 1 they were believed all to be quite diligent . bc01 1 was he believed ever to fail students ? bc01 1 there is tending to be more and more discussion of these issues . bc01 1 john seemed to be a great linguist . bc01 1 there promises to be a storm tonight . bc01 1 john strived to be successful . bc01 1 john wanted to improve his lot in life . bc01 1 john expected to win . bc01 1 this book is too dense to be read in one sitting . bc01 0 * there is too likely to be a riot to be a serious discussion of the issues . 
bc01 1 john tried . bc01 1 john remembered . bc01 1 john is refused . bc01 1 john forgot . bc01 0 * bill seems to be obnoxious , but i do n't think that sam happens . bc01 0 * bill seems to be obnoxious , but i do n't think that sam turns out . bc01 0 * bill seems to be obnoxious , but i do n't think that sam tends . bc01 0 * they tried all to like john . bc01 1 they seemed all to like john . bc01 1 john believes sally to be polite . bc01 1 i believe john with all my heart to be a fine person . bc01 0 * john is wanted to win . bc01 0 * john would be liked to win . bc01 1 we would like john to win . bc01 0 * john would be hated to win . bc01 0 * john would be preferred to be the candidate . bc01 1 we would prefer john to be the candidate . bc01 1 i would like for john to win . bc01 1 i would hate for john to win . bc01 1 i would prefer for john to be the candidate . bc01 1 john destroyed the house . bc01 1 the electrode emitted ions into the medium . bc01 1 ions struck the electrode . bc01 1 the medium contains ions . bc01 0 * the house destroyed john . bc01 1 ions left the electrode . bc01 0 * the electrode was left by ions . bc01 1 the electrode was struck by ions . bc01 1 the ball lies in the box . bc01 1 the ball rolled from the bush to the tree . bc01 1 the box contains the ball . bc01 1 the tree dropped fruit to the ground . bc01 1 fruit hit the ground from the tree . bc01 1 the stone knocked against the pole into the road . bc01 1 the stone knocked the pole into the road . bc01 1 the box contained the ball . bc01 0 * the box gradually contained the ball . bc01 0 * the box at once contained the ball . bc01 0 * the box contained the ball to the ground . bc01 1 the tree gradually dropped its fruit to the ground . bc01 1 the tree dropped its fruit to the ground . bc01 1 fruit hit the roof . bc01 1 fruit hit the roof from the tree . bc01 1 fruit at once hit the roof from the tree . bc01 0 * fruit hit the roof against the ground . 
bc01 0 * fruit at once hit the roof against the ground . bc01 1 fruit dropped from the tree . bc01 0 * fruit dropped from the tree from the clouds . bc01 0 * fruit fell against the house . bc01 0 * fruit fell against the house against the ground . bc01 1 the tree changed into an oak . bc01 1 the tree changed from a maple into an oak . bc01 0 * the maple changed into an oak from a cedar . bc01 1 the maple changed into an oak from a cedar . bc01 1 the maple changed into an oak . bc01 1 the oak developed out of a maple . bc01 1 the train reached the station . bc01 1 the branches knocked against the wall . bc01 1 the child became a man . bc01 1 the party lasted till midnight . bc01 1 the dog went crazy . bc01 1 it struck john that it was so . bc01 1 it came to john that it was so . bc01 1 the snake saw into the nest . bc01 1 hard work resulted in high grades . bc01 1 the farm passed to john . bc01 1 john is touching the wall . bc01 1 the wall is being touched by john . bc01 1 a bear occupies the cave . bc01 1 a bear inhabits the cave . bc01 1 water fills the tub . bc01 1 the electric main joins the house circuit in the basement . bc01 1 the house circuit is joined by the electric main in the basement . bc01 1 the fence straddles the sidewalk . bc01 1 the sidewalk is straddled by the fence . bc01 1 the man with a book . bc01 1 gas escaped the tube . bc01 1 the terrorist escaped the prison cell . bc01 1 the prison cell was escaped by the terrorist . bc01 1 the rolling stone avoided the river . bc01 1 the river was avoided by the rolling stone . bc01 1 the agents caught the terrorist . bc01 1 the sponge soaked up the water . bc01 1 the tub filled with water . bc01 1 john received a book . bc01 1 john learned a lesson . bc01 1 the parcel reached john . bc01 1 john received the parcel . bc01 1 the farm finally got to john after much litigation . bc01 0 * the farm finally reached john after much litigation . bc01 1 water filled the cup high . bc01 1 water filled the cup . 
bc01 0 * water emptied the cup . bc01 0 * the cup filled the water high . bc01 0 * the cup filled of water . bc01 1 the cup filled with water . bc01 0 * the cup emptied with water . bc01 1 the barge piled high with logs . bc01 0 * the road blocked with a stone . bc01 0 * the branch dropped bare of its apple . bc01 0 * the logs piled the barge high . bc01 1 a stone blocked the road . bc01 0 * the bottle drained the liquid free . bc01 1 the branch dropped its apple free . bc01 1 some branches broke off of the tree . bc01 0 * the tree broke off some branches . bc01 1 the tree dropped some branches . bc01 1 the tree lost some branches . bc01 1 water bubbled out of the kettle . bc01 0 * the kettle bubbled water up . bc01 1 the kettle bubbled water . bc01 0 * the cup filled water . bc01 0 * the stone knocked the pole into the road . bc01 1 the tub leaked empty of water . bc01 0 * the stone knocked against the pole into the road . bc01 1 hail stones broke the window . bc01 1 the force of the wind broke the window . bc01 0 * the window broke from hail stones . bc01 1 the window broke from the force of the wind . bc01 1 what the force of the wind did to the window was break it . bc01 1 john hit the stone against the wall . bc01 1 john hit the wall with the stone . bc01 1 john tapped some wine from a barrel . bc01 1 john tapped a barrel of some wine . bc01 1 john laid the book on the table . bc01 1 john included his name in the list . bc01 1 john loaded the bricks onto the truck . bc01 1 john loaded the truck with bricks . bc01 1 john fed rice to the baby . bc01 1 john fed the baby rice . bc01 1 john fed the baby up with rice . bc01 0 * john fed the baby rice up . bc01 1 the ball lies completely in the box . bc01 1 the box completely contains the ball . bc01 1 the train got to the station fully . bc01 1 the train reached the station fully . bc01 1 press the stamp against the pad completely . bc01 1 press the pad with the stamp completely . 
bc01 1 spray the paint onto the wall completely . bc01 1 spray all the paint onto the wall completely . bc01 0 * spray the wall with all the paint . bc01 1 spray the whole wall with the paint . bc01 1 what john did to the wall was paint it . bc01 1 what john did to the whole wall was paint it . bc01 1 what john did to the wall was hit it . bc01 0 * what the stone did to the wall was hit it . bc01 0 * what the stone did to the whole wall was hit it . bc01 1 john took bill to be a fool . bc01 0 * john concluded bill to be a fool . bc01 1 give the bottle to the baby full . bc01 0 * give the bottle to the baby awake . bc01 1 give the baby the bottle full . bc01 0 * give the baby the bottle awake . bc01 1 rub the cloth on the baby torn . bc01 0 * rub the cloth on the baby asleep . bc01 1 rub the baby with the cloth torn . bc01 0 * rub the baby with the cloth asleep . bc01 1 dry the baby with the cloth asleep . bc01 0 * dry the baby with the cloth torn . bc01 0 * the cup knocked the stone apart . bc01 1 the stone knocked the cup apart . bc01 1 the cup smashed apart against the stone . bc01 1 the stone smashed the cup apart . bc01 1 the tank filled with petrol out of the pump . bc01 1 the cup emptied of water onto the ground . bc01 1 john included her name in the list . bc01 1 john rolled the ball from the tree to the bush . bc01 1 john tapped the bottle of some water . bc01 1 john gave bill the book . bc01 1 john got the book from bill . bc01 0 * john gave bill of the book . bc01 1 we have someone in the living room . bc01 1 john is very fond of mary . bc01 1 mary laughed at john . bc01 1 the ship sank beneath the waves . bc01 1 mary considers john a fool and bill a wimp . bc01 1 john regards professors as strange and politicians as creepy . bc01 1 sue put the books on the table and the records on the chair . bc01 1 harriet gave a mug to john and a scarf to vivien . bc01 1 i expect john to win and harry to lose . bc01 1 you eat the fish raw and the beef cooked . 
bc01 1 they told sue who to talk to and virginia when to leave . bc01 1 smith loaned , and his widow later donated , a valuable collection of manuscripts to the library . bc01 1 sue moved , and mary also transferred , her business to a different location . bc01 1 i succeeded in convincing , even though john had failed to persuade , mary not to leave . bc01 1 we did n't particularly like , but nevertheless ate , the fish raw . bc01 1 flo desperately wants , though she does n't really expect , the miami dolphins to be in the play-offs . bc01 1 john learned french perfectly . bc01 1 bill recited his lines poorly . bc01 1 mary plays the violin beautifully . bc01 0 * john perfectly learned french . bc01 0 * bill poorly recited his lines . bc01 1 john learned french immediately . bc01 1 bill recited his lines slowly . bc01 1 mary will play the violin soon . bc01 1 john immediately learned french . bc01 1 bill slowly recited his lines . bc01 1 mary will soon play the violin . bc01 1 john immediately learned french perfectly . bc01 1 john learned french perfectly almost immediately . bc01 1 john learned french perfectly immediately . bc01 0 * john perfectly learned french immediately . bc01 0 * john learned french immediately perfectly . bc01 0 * clearly , john immediately will probably learn french perfectly . bc01 0 * immediately , john probably will clearly learn french perfectly . bc01 0 * clearly , john perfectly will immediately learn french probably . bc01 0 * john perfectly rolled the ball down the hill . bc01 1 john rolled the ball perfectly down the hill . bc01 1 john rolled the ball down the hill perfectly . bc01 0 * john perfectly shot the ball . bc01 1 john shot the ball perfectly . bc01 0 * john intimately spoke to mary . bc01 1 john spoke intimately to mary . bc01 1 john spoke to mary intimately . bc01 1 john spoke french intimately to mary . bc01 1 john spoke french to mary intimately . bc01 1 mary jumped the horse perfectly over the last fence . 
bc01 1 mary jumped the horse over the last fence perfectly .
bc01 0 * john spoke intimately french to mary .
bc01 0 * john spoke to mary french .
bc01 0 * mary persuaded to leave john .
bc01 0 * the lions ate raw the meat .
bc01 0 * mary persuaded that he should rest bill .
bc01 1 we consider the men all fools .
bc01 1 we consider the men all totally crazy .
bc01 0 * i saw the men all .
bc01 0 * the men were arrested all .
bc01 0 * the men arrived all .
bc01 1 the teacher ordered the two boys both to pay close attention .
bc01 1 they returned the books all to their owners .
bc01 1 we painted the chairs all red .
bc01 1 the trainer fed the steaks all to the lions .
bc01 0 * bill proud of himself john does n't consider .
bc01 0 * home was gone by john .
bc01 1 mary left the room angry .
bc01 0 * the room was left angry by mary .
bc01 0 * the room was left angry .
bc01 1 john resembles bill .
bc01 0 * bill is resembled by john .
bc01 1 the package weighed 10 lb .
bc01 0 * 10 lb was weighed by the package .
bc01 1 this book cost $ 10 .
bc01 0 * $ 10 was cost by this book .
bc01 1 the book cost john $ 10 .
bc01 0 * john was cost $ 10 by the book .
bc01 0 * john is impressed by bill as pompous .
bc01 0 * the boys were made a good mother .
bc01 0 * the boys were made a good mother by aunt mary .
bc01 0 * the kids were failed by max as a father .
bc01 0 * the kids were failed as a father .
bc01 0 * the men were struck by the idea as nonsense .
bc01 0 * the men were promised to leave .
bc01 0 * he impresses his friends all as pompous .
bc01 0 * aunt mary made the boys all a good mother .
bc01 0 * max failed the kids all as a father .
bc01 0 * frank promised the men all to leave .
bc01 0 * we proclaimed to the public john to be a hero .
bc01 1 we proclaimed john to the public to be a hero .
bc01 0 * we proclaimed sincerely john to be a hero .
bc01 1 we proclaimed john sincerely to be a hero .
bc01 0 * we proclaimed sincerely to the public john to be a hero .
bc01 1 we proclaimed john sincerely to the public to be a hero .
bc01 0 * they represented to the dean mary as a genuine linguist .
bc01 1 they represented mary to the dean as a genuine linguist .
bc01 0 * they represented seriously mary as a genuine linguist .
bc01 1 they represented mary seriously as a genuine linguist .
bc01 1 they represented mary seriously to the dean as a genuine linguist .
bc01 0 * we proved to the authorities smith to be the thief .
bc01 0 * we proved conclusively smith to be the thief .
bc01 1 we proved smith conclusively to be the thief .
bc01 0 * we proved conclusively to the authorities smith to be the thief .
bc01 1 we proved smith conclusively to the authorities to be the thief .
bc01 1 the gardener watered the tulips flat .
bc01 1 the grocer ground the coffee beans to a fine powder .
bc01 1 they painted their house a hideous shade of green .
bc01 1 the joggers ran their nikes threadbare .
bc01 1 the kids laughed themselves into a frenzy .
bc01 1 he coughed his handkerchief completely soggy .
bc01 1 they fed the meat to the lions raw .
bc01 0 * the lions ate at the meat raw .
bc01 1 we love them .
bc01 0 * we love they .
bc01 0 * we love their .
bc01 0 * us love their .
bc01 1 our love they .
bc01 1 our love them .
bc01 1 our love their .
bc01 0 * he belief that mary kissed bill is mistaken .
bc01 0 * him belief that mary kissed bill is mistaken .
bc01 1 his belief that mary kissed bill is mistaken .
bc01 1 mary loves him .
bc01 1 mary is fond of him .
bc01 0 * mary is fond him .
bc01 1 mary criticized him .
bc01 0 * mary 's criticism him was cruel .
bc01 1 mary 's criticism of him was cruel .
bc01 1 that john loves mary is doubtful .
bc01 0 * john to love mary would be doubtful .
bc01 1 for john to love mary would be doubtful .
bc01 1 to go abroad would be nice .
bc01 1 john 's plan to go abroad is nice .
bc01 1 mary believed john to have loved her .
bc01 1 mary considered john to have loved her .
bc01 1 mary reported john to have loved her .
bc01 0 * mary considered to have loved her .
bc01 1 mary tried to go abroad .
bc01 1 mary intended to go abroad .
bc01 1 mary managed to go abroad .
bc01 1 mary desired to go abroad .
bc01 0 * mary tried john to go abroad .
bc01 0 * mary managed john to go abroad .
bc01 0 * mary desired john to go abroad .
bc01 1 mary believed him to have loved her .
bc01 1 mary considered him to have loved her .
bc01 0 * mary believed he to have loved her .
bc01 0 * mary considered he to have loved her .
bc01 0 * mary reported he to have loved her .
bc01 0 * mary believed his to have loved her .
bc01 0 * mary considered his to have loved her .
bc01 0 * mary reported his to have loved her .
bc01 1 it is certain that john has loved mary .
bc01 1 it is likely that john has loved mary .
bc01 1 there are strangers in that garden .
bc01 0 * there is strangers in that garden .
bc01 0 * there is arriving three men at that station .
bc01 1 there are arriving three men at that station .
bc01 1 i consider there to be a man in that garden .
bc01 0 * i consider there a man in that garden .
bc01 1 they alleged there to have been many strangers in that garden .
bc01 0 * they alleged many strangers to have been in that garden .
bc01 1 john wagered there to have been a stranger in that haunted house .
bc01 0 * john wagered a stranger to have been in that haunted house .
bc01 1 john tried to kiss mary .
bc01 1 john persuaded mary to kiss him .
bc01 1 john told mary to kiss him .
bc01 1 it is illegal to park here .
bc01 1 i remembered him having kissed mary .
bc01 1 i reported him having kissed mary .
bc01 1 i reported having kissed mary .
bc01 1 i enjoy taking a bath .
bc01 1 i detest taking a bath .
bc01 0 * i enjoy him taking a bath .
bc01 0 * i detest him taking a bath .
bc01 1 i saw him kissing mary .
bc01 1 i noticed him kissing mary .
bc01 0 * i noticed kissing mary .
bc01 0 * there was known to everyone .
bc01 1 john 's refusing the offer is shocking .
bc01 1 the enemy 's destroying the city was horrific .
bc01 1 john 's refusal of the offer was shocking .
bc01 1 the enemy 's destruction of the city was horrific .
bc01 0 * john wanted to leave the room happy and leave the room he did happy .
bc01 1 i often send mary home drunk , and she gets there just fine .
bc01 0 * i raw eat fish drunk .
bc01 0 * i only eat fish drunk raw .
bc01 1 i do n't think fred will , either .
bc01 1 josé likes cabbage , and holly does too .
bc01 1 josé ate cabbage , and holly has too .
bc01 1 josé is eating cabbage , and holly is too .
bc01 1 john is leaving but mary 's not .
bc01 1 i consider bill intelligent and i consider sally not .
bc01 0 * sally started running down the street , but only after josé started .
bc01 0 * sally made bill laugh , and then josé made .
bc01 0 * mary came to read fred 's story , and i also came to .
bc01 1 john wants to go on vacation , but he does n't know when to .
bc01 0 * mary was told to bring something to the party , so she asked sue what to .
bc01 0 * we might go on vacation if we can ever figure out when to .
bc01 0 * ron wanted to wear a tuxedo to the party , but caspar could n't decide whether to .
bc01 0 * you should n't play with rifles because to is dangerous .
bc01 0 * john is being discussed and sally is being too .
bc01 0 * i remember john being discussed , but you recall sally being .
bc01 1 sally might have eaten cabbage , but holly should n't .
bc01 1 josé asks that we go to the meeting , and sally will tell us when .
bc01 0 * it 's we go to the meeting , that sally will tell us when .
bc01 1 it 's to mary that joe said holly can talk .
bc01 1 mary claimed that eaten cabbage , holly has n't .
bc01 1 mary claimed that eating cabbage , holly 's not .
bc01 1 mary claimed that eat cabbage , holly wants to .
bc01 0 * mary claimed that would eat cabbage , holly .
bc01 0 * mary claimed that has n't eaten cabbage , holly .
bc01 0 * mary claimed that eating cabbage , holly started .
bc01 0 * mary claimed that eat cabbage , holly made me .
bc01 0 * mary claimed that have eaten cabbage , holly should .
bc01 0 * mary claimed that intelligent , i consider holly not .
bc01 0 * lilly recounted a story to remember because holly had also recounted a story to .
bc01 0 *? i reviewed joe 's attempt to find holly while you reviewed josé 's attempt to .
bc01 0 *? mary questioned joe 's desire to eat cabbage , but only after i had questioned sally 's desire to .
bc01 0 *? sally explained the attempt to arrest holly , but only after i had denied the decision to .
bc01 1 john did n't hit a home run , but i know a woman who did .
bc01 1 that betsy won the batting crown is not surprising , but that peter did n't know she did is surprising .
bc01 0 * you should n't have played with rifles because to have is dangerous .
bc01 0 ?? ron wanted to be wearing a tuxedo to the party , but caspar did n't know whether to be .
bc01 0 * lilly recounted a story to be remembered because holly had recounted a story to be .
bc01 1 lilly decided that eating cabbage , she should be .
bc01 0 * lilly decided eating cabbage , to be .
bc01 1 read fred 's story , i also want to .
bc01 0 * you should n't play with rifles because play with rifles to is dangerous .
bc01 0 ?? ron wanted to wear a tuxedo to the party , but wear a tuxedo to the party caspar could n't decide whether to .
bc01 0 * lucy barnes recounted a story to remember because remember holly had recounted a story to .
bc01 1 mag wildwood came to introduce the bartender but i came not to .
bc01 1 mag wildwood came to introduce the bartender but i came precisely not to .
bc01 1 you should unload rifles because not to s is dangerous .
bc01 1 if ron knows whether to wear a tuxedo , and caspar knows whether not to , do they know different things ?
bc01 1 lucy recounted a story to remember because holly had recounted as story not to .
bc01 0 * i will , if i can work on it .
bc01 1 did harry leave ?
bc01 1 does joe sing ?
bc01 0 * a proof that god exist does .
bc01 0 * a proof that god does exists .
bc01 0 * i visited every town in every country i had to .
bc01 1 every man who said he would buy some salmon did .
bc01 1 i visited every town i had to .
bc01 1 every town in every country i had to i visited .
bc01 1 every man who said he would buy some salmon did buy some salmon .
bc01 1 lilly should buy salmon and mary should too .
bc01 1 lilly should buy salmon and mary should buy salmon too .
bc01 1 joe 's neuroses bother his patrons , and sally 's neuroses do too .
bc01 1 joe likes his bar , and sally 's patrons do too .
bc01 1 every picture of itself arrived .
bc01 1 my uncle does n't have a spouse but your aunt does and he is lying on the floor .
bc01 0 * my uncle did n't buy anything for christmas , but my aunt did it for him and it was bright red .
bc01 1 i know which book max read , and which book oscar did n't .
bc01 1 this is the book of which bill approves , and this is the one of which he does n't .
bc01 0 ?* i know which book mag read , and which book bob asked why you had n't .
bc01 0 ?* i know which book mag read , and which book bob discussed after i had .
bc01 1 dulles suspected everyone who angleton did .
bc01 1 while bob read fred , he did n't dickens .
bc01 1 sally suspected joe , but he did n't holly .
bc01 0 * although mag does n't eggplants , sally eats cabbage .
bc01 0 ?* although i do n't know which book sam did , i do know which book sally read .
bc01 0 ?* near everyone angleton did , dulles stood .
bc01 0 * sally will stand near mag , but he wo n't holly .
bc01 0 * while holly did n't discuss a report about every boy , she did every girl .
bc01 1 sally will stand near every woman that you will .
bc01 1 i know which woman holly will discuss a report about , but i do n't know which woman you will .
bc01 0 * sam stood near yesterday every one of the women we 'd been discussing .
bc01 0 * truman visited yesterday you .
bc01 0 * truman told the story bob .
bc01 1 while truman did n't visit me , he did you .
bc01 1 while truman did n't tell me a story , he did rusty .
bc01 1 while josé wo n't talk about mag , he might about holly .
bc01 1 although doc might tell it to you , he wo n't to me .
bc01 1 i think you need to show yourself more than you do anyone else .
bc01 1 while truman does n't want to visit every city , he does barcelona .
bc01 0 * while rusty might leave in order to please mag , he wo n't his father .
bc01 0 * while doc might claim that bob had read his book , he wo n't the paper .
bc01 0 * i 'll turn the radio down , but i wo n't up .
bc01 1 fred likes eggplants , although he likes cabbage too .
bc01 1 although he likes cabbage too , fred likes eggplants .
bc01 1 fred gave flowers to his sweetie because frank had .
bc01 1 china is a country that joe wants to visit , and he will too , if he gets enough money .
bc01 1 jerry would n't read a book by babel , but meryl has done so and it was pretty good .
bc01 0 * i know which book max read , and which book oscar has n't done so .
bc01 1 joe might wish he had , but this is n't a country he has visited .
bc01 1 while i might want to , this is the kind of thing that harris has already suggested .
bc01 1 we like our friends and they do too .
bc01 1 we like our friends and they like our friends too .
bc01 1 we like our friends and they like their friends , too .
bc01 1 rusty talked about himself only after holly did .
bc01 0 * rusty talked about himself only after mary did talk about himself .
bc01 1 i could find no solution , but holly might .
bc01 1 fred talked about everything before rusty did .
bc01 1 joe will go to the store , even though fred already has .
bc01 1 today there is little or no official harassment of lesbians and gays by the national government , although autonomous governments might .
bc01 1 the candidate was dogged by charges of infidelity and avoiding the draft , or at least trying to .
bc01 0 * david is a great artist , and when he does , his eyes squint at you .
bc01 0 * the candidate was dogged by charges of infidelity , or at least trying to .
bc01 1 this information could have been released by gorbachev , but he chose not to .
bc01 1 a lot of this material can be presented in a fairly informal and accessible fashion , and often i do .
bc01 0 * john likes not mary .
bc01 1 john does not like mary .
bc01 0 * john meets often mary .
bc01 1 john tries to often meet mary .
bc01 0 * john tries to meet often mary .
bc01 1 john tries not to meet mary .
bc01 0 * john tries to meet not mary .
bc01 1 is mary running the marathon ?
bc01 0 * runs mary the marathon ?
bc01 1 mary is often running the marathon .
bc01 0 * mary runs often the marathon .
bc01 1 mary is not running the marathon .
bc01 1 i did n't , as bill had thought , go to the store .
bc01 1 i did , as bill had thought , go to the store .
bc01 0 * i did not , as bill had thought , go to the store .
bc01 1 the writers could so believe the boy .
bc01 0 * the writers so believed the boy .
bc01 1 the writers did so believe the boy .
bc01 0 * the writers did n't so believe the boy .
bc01 1 rome destroyed carthage .
bc01 1 rome 's destruction of carthage was horrific .
bc01 1 john bought the picture of himself that bill saw .
bc01 1 the perception of the problem is quite thorough .
bc01 1 the knowledge of the problem is quite thorough .
bc01 0 * the problem 's perception is quite thorough .
bc01 0 * the problem 's knowledge is quite thorough .
bc01 0 * the problem knows easily .
bc01 0 * the ship sank to collect the insurance .
bc01 1 the sinking of the ship was very devious .
bc01 1 the sinking of the ship to collect the insurance was very devious .
bc01 1 the ship 's sinking was very devious .
bc01 0 * the ship 's sinking to collect the insurance was very devious .
bc01 1 the testing of such drugs on oneself is too risky .
bc01 0 * this drug 's testing on oneself is too risky .
bc01 1 the ship was sunk to collect the insurance .
bc01 1 this drug must first be tested on oneself .
bc01 1 the president 's moral destruction is complete .
bc01 1 the moral destruction of the president was certainly not helpful .
bc01 1 mary wants to wear nice blue german dress .
bc01 1 tomatoes were introduced in europe after 1492 .
bc01 1 we rich have impeccable taste .
bc01 0 * rich we have impeccable taste .
bc01 0 * i read three his books .
bc01 0 * i read every his book .
bc01 1 i read his every book .
bc01 1 every boy named a planet .
bc01 1 i showed every boy a planet .
bc01 1 few boys read any of the books .
bc01 1 i showed few boys any of the books .
bc01 0 * that few boys came upset any of the teachers .
bc01 1 i was not reading a book when you came in .
bc01 1 a boy did not laugh .
bc01 1 most boys did not laugh .
bc01 1 every boy named mercury and venus .
bc01 1 every boy named every planet .
bc01 1 each student speaks two languages .
bc01 1 two students speak each language .
bc01 1 some tourists visited all the museums .
bc01 1 fond of some boy every girl is .
bc01 0 * guinevere has a single bone that is in every corner of the house .
bc01 1 a critic thinks that every book is readable .
bc01 1 who does he admire ?
bc01 1 he admires every man .
bc01 0 * what does who admire ?
bc01 1 who admires what ?
bc01 1 someone from every city hates it .
bc01 1 some professor admires every student .
bc01 1 some professor admires every student and hates the dean .
bc01 0 * you filed every paper without inspecting .
bc01 1 everyone reported that max and some lady disappeared .
bc01 1 most guests will be offended if we do n't invite some philosopher .
bc01 1 all students believe anything that many teachers say .
bc01 1 who will be offended if we do n't invite which philosopher ?
bc01 1 who believes anything that who says ?
bc01 1 exactly two boys kissed some girl .
bc01 1 mary dates exactly two of the men who know a producer i like .
bc01 1 every student has to come up with three arguments that show that some condition proposed by bill is wrong .
bc01 1 if we invite some philosopher , max will be offended .
bc01 1 three relatives of mine inherited a house .
bc01 1 if three relatives of mine die , i will inherit a house .
bc01 1 everyone attended some seminar .
bc01 1 exactly half of the students attended some seminar .
bc01 1 more than three students attended every seminar .
bc01 1 every student attended more than three seminars .
bc01 0 * every man surrounded the fort .
bc01 1 every man lifted the table .
bc01 0 * every man lifted the table together .
bc01 1 the men surrounded the fort .
bc01 1 all the men surrounded the fort .
bc01 1 the men lifted the table together .
bc01 1 a hundred men lifted the table together .
bc01 1 all the men lifted the table together .
bc01 1 every man lifted a table .
bc01 1 each man lifted a table .
bc01 1 someone attended every seminar .
bc01 1 more than two students attended every seminar .
bc01 1 you married no one .
bc01 1 i will force you to marry no one .
bc01 0 * we voted for me .
bc01 1 everyone had been worrying himself stiff .
bc01 1 everyone who had been worrying himself stiff said that he was relieved .
bc01 1 there were five tourists in the room apart from myself .
bc01 1 physicists like yourself are a godsend .
bc01 1 max boasted that the queen invited lucie and himself for a drink .
bc01 1 which pictures of him did earl see ?
bc01 1 which pictures of earl did he see ?
bc01 1 bill seems to himself to be handsome .
bc01 1 bill seems to him to be handsome .
bc01 1 john will see which picture of himself ?
bc01 1 each other 's houses seem to the women to be garish .
bc01 1 each other 's houses appear to the women to be garish .
bc01 0 * old pictures of themselves convinced the children to pretend to be adults .
bc01 0 * each other 's houses proved to the women that they had bad taste .
bc01 1 these stories about himself worry john more than anything else .
bc01 0 * these stories about himself describe john better than any official biography .
bc01 1 which picture that john took at the party did he decide to display in his house ?
bc01 1 which report that john revised did he submit ?
bc01 1 mary always prefers lemons to limes .
bc01 1 mary always has preferred lemons to limes .
bc01 1 the dog that the rat bit chased the cat .
bc01 0 * the cat that the dog that the rat bit chased died .
bc01 1 jean never reads this newspaper .
bc01 0 * jean reads never this newspaper .
r-67 1 a gun which i had cleaned went off .
r-67 1 i gave a gun which i had cleaned to my brother .
r-67 1 i gave a gun to my brother which i had cleaned .
r-67 1 he let the cats out which were whining .
r-67 0 * what did bill buy potatoes and ?
r-67 0 * what dl , d john fall asleep and bill wear ?
r-67 1 who did mary see walking toward the railroad station ?
r-67 1 whom did mary see walking toward the railroad station ?
r-67 1 do you know the boy who mary saw ?
r-67 1 do you know the boy whom mary saw ?
r-67 1 the government prescribes the height of the lettering on the covers of the reports .
r-67 0 * here is the snowball which i chased the boy who threw at our teacher .
r-67 0 * where 's the bikini which tom mentioned the fact that sue had worn ?
r-67 0 * who did he expect who i was acquainted with to show up ?
r-67 1 who did he expect to show up who i was acquainted with ?
r-67 1 whose book did you find ?
r-67 1 he will put the chair between some table and some sofa .
r-67 0 * what table will he put the chair between some table and ?
r-67 0 * what table will he put the chair between and some sofa ?
r-67 1 i know who is mad at john .
r-67 1 i know a boy mad at john .
r-67 1 john is taller than dill .
r-67 1 john is taller than bill is .
r-67 1 i want to go .
r-67 1 shaving myself is difficult for me .
r-67 1 the shock touched off the explosion .
r-67 1 the shock touched the explosion off .
r-67 1 i called almost all of the men from boston up .
r-67 0 * i ran a man who was old down .
r-67 1 i ran an old man down .
r-67 0 * i 'm going to call somebody who is .
r-67 0 * i polished the vase which was from india up .
r-67 1 he attributed the fire to a short circuit .
r-67 0 * he attributed to a short circuit the fire .
r-67 1 he attributed to a short circuit the fire which .
r-67 1 he threw the letter into the wastebasket .
r-67 0 * he threw into the wastebasket the letter .
r-67 1 they dismissed the proposal as too costly .
r-67 0 * they dismissed as to costly the proposal .
r-67 1 they dismissed as too costly the proposal for the state to build a sidewalk from dartmouth to smith .
r-67 1 i found to be delicious some fruit which i picked up on the way home .
r-67 1 i found delicious some fruit which i picked up on my way home .
r-67 0 * i consider to be a fool the senator who made the opening speech .
r-67 0 * did that john showed up please you ?
r-67 1 did the fact that john showed up please you ?
r-67 0 ?* that that john showed up pleased her was obvious .
r-67 1 i want the fact that bill left to remain a secret .
r-67 1 i want it to remain a secret that bill left .
r-67 0 * what what i ate cost almost broke me .
r-67 1 what the thing which i ate cost almost broke me .
r-67 1 what the thing cost which i ate almost broke me .
r-67 0 * i went out with a girl who that john showed up pleased .
r-67 1 i went out with a girl who it pleased that john showed up .
r-67 1 i loaned a man who was watching the race my binoculars .
r-67 0 * i loaned my binoculars a man who was watching the race .
r-67 1 she asked a man who was near the window whether it looked like rain .
r-67 0 * we called my father , who had just turned 60 , up .
r-67 0 ?* we elected my father , who had just turned 60 , president .
r-67 0 * they gave my father , who had just turned 60 , it .
r-67 1 he figured it out .
r-67 0 * he figured out it .
r-67 0 * he figured out that .
r-67 1 he figured ann out .
r-67 0 ?* he figured out ann .
r-67 1 he figured something out .
r-67 1 he figured the answer out .
r-67 1 he figured out the answer .
r-67 0 * i sent him it .
r-67 1 i sent him that .
r-67 1 i sent him something .
r-67 0 ?* we elected the man who he had brought with him president .
r-67 1 they gave the reports which he had brought with him to me .
r-67 0 * he kept company some girls who had been injured in the wreck .
r-67 0 ?* he kept some girls who had been injured in the wreck company .
r-67 0 * i insist on seeing through all the students who had started out the term in my class .
r-67 0 ?* i insist in seeing all the students who started out the term in my class through .
r-67 1 i insist on seeing all the student ' through who started out the term in my class .
r-67 0 * the doctor brought to the passengers who had passed cut from the fumes .
r-67 0 * he tries to put on everyone who he does n't like .
r-67 0 ?* he tries to put everyone who he does n't like on .
r-67 0 * i watched the indians who the man who had been my advisor in my freshman year had advised me to study when i got to utah talk .
r-67 1 tom drives as that man does .
r-67 1 tom drives like that man .
r-67 0 * i know a man who tom drives as does .
r-67 1 i know a man who tom drives like .
r-67 1 tom drives the way that that man drives .
r-67 1 toms drives the way that that man does .
r-67 1 john is taller than that man is .
r-67 1 is taller than that man .
r-67 0 * i know a man who john is taller than is .
r-67 1 john is as tall as that man .
r-67 0 * i know a man who john is as tall as is .
r-67 1 i know a man who john is as tall as .
r-67 1 mary has never kissed a man who is taller than john is .
r-67 1 mary has never kissed a man who is taller than john .
r-67 1 mary has never kissed a man taller than john .
r-67 0 * mary has never kissed a man taller than john is .
r-67 0 ?* mary has never kissed as tall a man as john is .
r-67 1 mary has never kissed as tall a man as john .
r-67 1 the brave are not afraid to die .
r-67 1 drowning cats are hard to rescue .
r-67 1 drowning cats is against the law .
r-67 1 i know a taller man than john .
r-67 1 the shooting of the prisoners shocked me .
r-67 1 he told peter that i know a taller man than john , but peter did n't believe it .
r-67 1 i divulged when bill promised to call me , but i did so reluctantly .
r-67 1 i 'll talk to john on friday about the report that the shooting of the prisoners shocked me , and to his wife on saturday .
r-67 1 i read a statement which was about that man .
r-67 1 i read a statement about that man .
r-67 0 * the man who i read a statement which was about is sick .
r-67 1 the man who i read a statement about is sick .
r-67 1 i read that bill had seen me .
r-67 0 * i read that bill had seen myself .
r-67 1 evidence that he was drunk will be presented .
r-67 1 evidence will be presented that he was drunk .
r-67 1 that the defendant had been rude was stoutly denied by his lawyer .
r-67 1 bill told me something awful : that ice wo n't sink .
r-67 1 this is a hat which i 'm going to see to it that my wife buys .
r-67 1 this is a hat which i 'm going to see that my wife buys .
r-67 1 phineas knows a girl who is jealous of maxime .
r-67 1 phineas knows a girl who is behind maxime .
r-67 1 phineas knows a girl who is working with maxime .
r-67 0 * who does phineas know a girl who is jealous of ? .
r-67 0 * who does phineas know a girl who is behind ?
r-67 0 * who does phineas know a girl who is working with ?
r-67 0 * who does phineas know a girl jealous of ?
r-67 0 * who does phineas know a girl behind ?
r-67 0 * who does phineas know a girl working with ?
r-67 1 i believed the claim that otto was wearing this hat .
r-67 1 i believed that otto was wearing this hat .
r-67 0 * the hat which i believed the claim that otto was wearing is red .
r-67 1 the hat which i believed that otto was wearing is red .
r-67 1 rutherford understands himself .
r-67 0 * rutherford is understood . by himself .
r-67 1 the man who ordered ice cream said the pudding would be tasty .
r-67 1 the pudding which the man who ordered ice cream said would be tasty was a horror show .
r-67 1 the man who ordered it said the pudding would be tasty .
r-67 0 * the pudding which the man who ordered it said would be tasty was a horror show .
r-67 1 the sheriff denied that gangsters had bribed him .
r-67 1 that gangsters had bribed him was denied by the sheriff .
r-67 0 * the money which i am discussing the claim that the company squandered amounts to $ 400,000 .
r-67 0 * the money which i am discussing sarah 's claim that the company squandered amounts to $ 400,000 .
r-67 1 the money which i have hopes that the company will squander amounts to $ 400,000 .
r-67 1 the money which i will have a chance to squander amounts to $ 400,000 .
r-67 1 the money which i will make a proposal for us to squander amounts to $ 400,000 .
r-67 1 the money which i will make a proposal that we squander amounts to $ 400,000 .
r-67 1 i yawned .
r-67 1 sam progressed .
r-67 1 bill gave me $ 40 .
r-67 1 i took a snooze .
r-67 1 sam made progress .
r-67 1 bill made a gift to me of $ 40 .
r-67 1 max gave the car a shove .
r-67 1 i have a feeling that arch will show up .
r-67 1 bob proved that this set is recursive .
r-67 1 bob proved this set is recursive .
r-67 1 the proof that this set is recursive is difficult .
r-67 1 i have hopes the company will squander the money .
r-67 1 i have a feeling the company will squander the money .
r-67 0 * i made a proposal we squander the money .
r-67 1 dick 's claim that semantics is generative is preposterous .
r-67 1 we are discussing their claim that flying saucers are real .
r-67 0 * myron is making suzie 's claim that dead is better than red .
r-67 1 myron is making the claim that dead is better than red .
r-67 0 * i have tom 's feeling that the company will squander the money .
r-67 0 * myra took betty 's snooze .
r-67 0 * bill made sarah 's gal to me of $ 40 .
r-67 0 * max gave the car levi 's shove .
r-67 1 the man who was arrested by officer bob went mad .
r-67 1 jack is claiming that you wo n't need it .
r-67 1 jack is claiming you wo n't need it .
r-67 0 * the claim you wo n't need it is being made by jack .
r-67 1 you 're going to hurt yourself one of these days .
r-67 1 i spoke to bill about himself .
r-67 0 * he said that himself was hungry .
r-67 1 i know a man who hates me .
r-67 0 * i know a man who hates myself .
r-67 1 i read him two statements about himself .
r-67 0 * i read him judy 's statement about himself .
r-67 1 i know two men who are behind me .
r-67 1 i know two men behind me .
r-67 0 * i know two men behind myself .
r-67 1 you are too flip with people who are jealous of you .
r-67 1 i screamed at some children who were watching me .
r-67 1 i screamed at some children watching me .
r-67 0 * what sofa will he put the chair between some table and ?
r-67 0 * what table will he put the chair between and some sofa .
r-67 0 * the lute which henry plays and sings madrigals is warped .
r-67 0 * the nurse who polished her trombone and the plumber computed my tax was a blond .
r-67 0 * which trombone did the nurse polish and ?
r-67 0 * the plumber who the nurse polished her trombone and computed my tax was a hefty fellow .
r-67 0 * whose tax did the nurse polish her trombone and the plumber compute ?
r-67 1 irma washed the dishes , and sally dried , and floyd idled .
r-67 1 i went to the store and bought some whisky .
r-67 1 i went to the store and nike bought some whisky .
r-67 1 here 's the whisky which i went to the store and bought .
r-67 1 here 's the whisky which i went to the store and mike bought .
r-67 1 tony has a fiat and yearns for a tall nurse .
r-67 0 * the tall nurse who tony has a fiat and yearns for is cruel to him .
r-67 0 * the shirts which i went to the movies and did n't pick up will cost us a lot of money .
r-67 1 i went to the store and have bought some excellent whisky .
r-67 0 * the excellent whisky which i went to the store and have bought was very costly .
r-67 1 i went to the store to buy some whisky .
r-67 0 * tony has a fiat to yearn for a tall nurse .
r-67 0 * i went to the movies not to pick the shirts up .
r-67 0 * i went to the movies to not pick the shirts up .
r-67 0 * i went to the store to have bought some whisky .
r-67 1 she 's gone and ruined her dress now .
r-67 1 i 've got to try and find that screw .
r-67 1 aunt hattie wants you to be nice and kiss your granny .
r-67 1 the screw which i 've got to try and find holds the door to the frame .
r-67 1 which granny does aunt hattie want me to be nice and kiss ?
r-67 1 the boy works in a skyscraper and the girl in a quonset hut .
r-67 0 * which boy works in a skyscraper and the girl in a quonset hut ?
r-67 0 * the skyscraper which the boy works in and the girl in a quonset hut belongs to uncle sam .
r-67 0 * the girl who the by works in a skyscraper and in a quonset but has a dimple on her nose .
r-67 0 * which quonset hut does the boy work in a skyscraper and the girl in ?
r-67 1 the luscious chick who billy went to the movies with will wed me ere the morn .
r-67 0 * the luscious chick who billy and went to the movies will wed me ere the morn .
r-67 0 * the ferrari which pietro bought from me and sofia adores him cost him a bundle .
r-67 1 the ferrari which pietro , who sofia adores , bought from me cost him a bundle .
r-67 1 sally might be pregnant , and everyone believes sheila definitely is pregnant .
r-67 1 sally might be , and everyone believes sheila definitely is , pregnant .
r-67 1 tom picked these grapes , and i washed these grapes , and suzie will prepare these grapes .
r-67 1 tom picked , and i washed , and suzie will prepare , these grapes .
r-67 0 * tom picked , and i washed some turnips , and suzie will prepare , these grapes . r-67 1 students who fail the final exam or who do not do the reading will be executed . r-67 1 students who fail the final exam will be executed or students who do not do the reading will be executed . r-67 1 john has been captured by the cops and i 'm afraid he 'll talk . r-67 1 i heated up the coffee and sally wiped the table off . r-67 1 although bob may not be a nut , many people have claimed it and i think so too . r-67 0 * although bob may not be a nut , many people have claimed and i think so too . r-67 1 when did you get back and what did you bring me ? r-67 1 make yourself comfortable . r-67 1 did merv show up and did you play chess ? r-67 0 * sally 's sick and what did you bring me ? r-67 0 * make yourself comfortable and i got sick . r-67 0 * what are you eating or did you play chess ? r-67 0 * which boy and the girl embraced ? r-67 0 * i 'm hungry and did you play chess ? r-67 1 who ate what ? r-67 1 what exploded when ? r-67 1 who gave what to whom ? r-67 1 how long did this fit of generosity last and who gave what to whom ? r-67 0 * i saw you there and who ate what ? r-67 0 * what exploded when and i warned you it would ? r-67 1 please make yourself comfortable and i 'll wash the dishes . r-67 1 you please make yourself comfortable and i 'll wash the dishes . r-67 1 harry will be in the marines next year and herman was drafted last night . r-67 1 sasha is gobbling down dumplings faster than i can reheat them . r-67 1 i want to peruse that contract before filing it away . r-67 1 fred tore the curtain in roiling it up . r-67 0 ?? the dumplings which sasha is gobbling down faster than i can reheat are extremely tasty , if i do say so . r-67 1 the curtain which fred tore in rolling up was the kind gift of my maternal aunt priscilla . r-67 1 i want to peruse that contract before damaging it while filing it away . 
r-67 1 sasha is gobbling down dumplings faster than i can reheat the meatballs . r-67 1 i want to peruse that contract before filing away the deed . r-67 1 fred tore the curtain in rolling up the wallpaper . r-67 0 * i think anita may have poisoned the meatballs which sasha is gobbling down dumplings faster than i can reheat . r-67 0 * the deed which i want to peruse that contract before filing away is probably a forgery . r-67 1 the dumplings which sasha is gobbling down faster than i can reheat the meatballs are extremely tasty , if i do say so . r-67 1 i suspect that the contract which i want to peruse before filing away the deed may some loopholes . r-67 1 the curtain which fred tore in rolling the wallpaper up was the kind gift of my maternal aunt priscilla . r-67 1 the dumplings which sasha is gobbling down faster than i can reheat them are extremely tasty , if i do say so . r-67 1 reports which the government prescribes the height of the lettering on the covers of are invariably boring . r-67 1 reports the covers of which the government prescribes the height of the lettering on almost always put me to sleep . r-67 1 reports the lettering on the covers of which the government prescribes the height of are a shocking waste of public funds . r-67 1 reports the height of the lettering on the covers of which the government prescribes should be abolished . r-67 0 * reports of which the government prescribes the height of the lettering on the covers are invariably boring . r-67 0 * reports on the covers of which the government prescribes the height of the lettering almost always put me to sleep . r-67 0 * reports of the lettering on the covers of which the government prescribes the height are shocking waste of public funds . r-67 1 he has books by several greek authors . r-67 1 which greek authors does he have books by ? r-67 0 ?* by which greek authors does he have books ? r-67 0 * the boy who i watched bill and was vain . 
r-67 0 * the boy bill and who i watched was vain . r-67 1 they will give me a hat which i know that i wo n't like . r-67 0 * they will give me a hat that i wo n't like which i know . r-67 1 the boy whose guardian 's employee we elected president betrayed us . r-67 0 * the boy whose guardian 's we elected employer president betrayed us . r-67 0 * the boy whose we elected guardian 's guardian 's employer president betrayed us . r-67 1 i 'm going to ask bill to make the old geezer take up these points later . r-67 1 these points i 'm going to ask bill to make the old geezer take up later . r-67 1 the boy 's guardians ' employer we elected president . r-67 0 * the boy 's guardian 's we elected employer president . r-67 0 * the boy 's we elected guardian 's employer president . r-67 1 we elected president the boy 's guardian 's employer . r-67 0 * we elected employer president the boy 's guardian 's . r-67 0 * we elected guardian 's employer president the boy . r-67 1 which boy 's guardian 's employer did we elect president ? . r-67 0 * which boy 's guardian 's did we elect employer president ? r-67 0 * how have you picked up tnt carelessly ? r-67 1 how carelessly have you picked up tnt ? r-67 1 sheila married that tall a man . r-67 1 how tall a man did sheila marry ? r-67 0 * how tall did sheila marry a man ? r-67 0 * how did sheila marry tall a man ? r-67 1 on which bed does tom sleep ? r-67 1 the bed on which tom slept was hard . r-67 1 which bed did tom sleep on ? r-67 1 the bed which tom slept on was hard . r-67 1 my sister arrived at a time when no buses were running , and my brother arrived at a time when no buses were running too . r-67 1 jack disappeared in a mysterious manner and the hudson disappeared in a mysterious manner too . r-67 0 * jack disappeared in a mysterious manner and marion disappeared in one too . r-67 0 * what time did you arrive at ? r-67 0 * the manner which jack disappeared in was creepy . 
r-67 0 * the place which i live at is the place where route 150 crosses the hudson river . r-67 1 the only relatives who i 'd like to do away with are my aunts . r-67 1 that meeting i 'd like to sit in on . r-67 0 * the only relatives with whom i 'd like to do away are my aunts . r-67 0 * to whom is she trying to make up now ? r-67 0 * on that meeting i 'd like to sit in . r-67 1 for whose rights do you expect me to speak up ? r-67 1 one plan which i got wind of was calculated to keep us in suspense . r-67 1 did you notice which difficulties she made . r-67 1 who are you trying to get hold of ? r-67 0 * one plan of which i got wind was calculated to keep us in suspense . r-67 0 ?* did you notice of which difficulties she made light ? r-67 0 * of whom are you trying to get hold ? r-67 1 the only offer of which i plan to take advantage will give me an eleven month paid vacation . r-67 1 the scenes to which the censors took objection had to do with the mixed marriage of a woman and a giant panda . r-67 1 advantage will be taken of his offer . r-67 1 his offer will be taken advantage of . r-67 1 in this experiment , fourteen variables must be kept track of simultaneously . r-67 1 objection was taken to the length of our skirts . r-67 1 a plan to negotiate an honorable end to the war in vietnam was gotten wind of . r-67 0 * light was made of her indiscretions . r-67 1 her indiscretions were made light of . r-67 0 * hold has been gotten of some rare old manuscripts . r-67 1 some rare old manuscripts have been gotten hold of . r-67 1 use was made of gauss 's polynomial lemma . r-67 1 tabs were kept on all persons entering the station . r-67 0 ?? the persons on whom we kept tabs all proved to be innocent . r-67 0 * faith was had in all kinds of people . r-67 1 my friends mike talked to about politics yesterday . r-67 1 to my friends mike talked about politics yesterday . r-67 0 * mike talked to about politics yesterday my friends . r-67 0 ?? 
she made light , not too surprisingly , of the difficulties we might have at the border . r-67 1 i gave to the officer in charge the blackjack . which i had found in the cookie jar . r-67 1 i am confident of , and my boss depends on , a successful outing at the track . r-67 0 * the university 's students are intelligent and faculty is committed to freedom . r-67 1 the boy 's uncle and aunt were kissing . r-67 0 * the boy whose uncle and tom 's aunt 's grandmother were kissing was furious . r-67 1 who are you gawking at ? r-67 1 which hat do you believe that she never wore ? r-67 1 the reporters expected that the principal would fire some teacher . r-67 1 that the principal would fire some teacher was expected by the reporters . r-67 1 the teacher who the reporters expected that the principal would fire is a crusty old jerk . r-67 0 * the teacher who that the principal would fire was expected by the reporters is a crusty old jerk . r-67 1 the teacher who it was expected by the reporters that the principal would fire is a crusty old jerk . r-67 0 * the hat which that i brought seemed strange to the nurse was a fedora . r-67 1 i disliked the boy 's playing the piano loudly . r-67 1 the boy whose loud playing of the piano i disliked was a student . r-67 1 the piano which i disliked the boy 's playing loudly was badly out of tune . r-67 1 the boy 's loud playing of the piano drove everyone crazy . r-67 1 the boy 's playing the piano loudly drove everyone crazy . r-67 1 that piano , the boy 's loud playing of which drove everyone crazy , was badly out of tune . r-67 0 * that piano , the boy 's playing which loudly drove everyone crazy , was badly out of tune . r-67 0 * that piano , which the boy 's playing loudly drove everyone crazy , was badly out of tune . r-67 0 * did that he played the piano surprise you ? r-67 0 * would for him to have played the piano have surprised you ? r-67 0 * is whether he played the piano known ? 
r-67 1 did his having played the piano surprise you ? r-67 1 mike quipped that she never wore this hat . r-67 0 * mike quipped she never wore this hat . r-67 0 * i dislike it for him to tickle myself . r-67 0 * i dislike his tickling myself . r-67 1 for anna to tickle him drives frank crazy . r-67 1 anna 's tickling him drove frank crazy . r-67 1 they are investigating all people owning parakeets . r-67 0 * the cages which we donated wire for the convicts to build with are strong . r-67 0 * what kind cf parakeets are they investigating all people owning ? r-67 1 it appears to be true that harry likes girls . r-67 1 this is the dog that chased the cat that caught the rat that ate the cheese . r-67 0 * her efficient looking of the answer up pleased the boss . r-67 1 her efficient looking up of the answer pleased the boss . r-67 1 she did away with her father . r-67 0 * she did with her father away . r-67 1 it is not true that that bob was lying was obvious . r-67 1 a proof was given that the claim that john had lied had been made . r-67 0 * a proof that the claim had been made was given that john had lied . r-67 0 * that sam did n't pick those packages up is possible which are to be mailed tomorrow . r-67 0 ?* that sam did n't pick those packages which are to be mailed tomorrow up is possible . r-67 0 * it is possible that sam did n't pick those packages which are to be mailed tomorrow up . r-67 1 that sam did n't pick those packages up which are to be mailed tomorrow is possible . r-67 1 which packages is it possible that sam did n't pick up which are to be mailed tomorrow ? r-67 1 sam did n't pick those packages up which are to be mailed tomorrow until it had stopped raining . r-67 1 sam picked those packages up which are to be mailed tomorrow rest might , but he did n't want to do so until it had stopped raining . r-67 0 ?* sam did n't pick those packages up until it had stopped raining which are to be mailed tomorrow . 
r-67 1 which packages which are to be mailed tomorrow is it possible that sam did n't pick up until it had stopped raining ? r-67 0 * which packages is it possible that sam did n't pick up which are to be mailed tomorrow until it had stopped raining ? r-67 0 * which packages did n't sam pick up which are to be mailed tomorrow until it had stopped raining ? r-67 1 a girl came in who had worn this coat . r-67 0 * the coat which a girl came in who had worn was torn . r-67 1 that that for herschel to throw a fit would confuse the guards was obvious is not true . r-67 1 it is not true that that for herschel to throw a fit would confuse the guards was obvious . r-67 1 it is not true that it was obvious that for herschel to throw a fit would confuse the guards . r-67 1 that that it would confuse the guards for herschel to throw a fit was obvious is not true . r-67 1 it is not true that that it would confuse the guards for herschel to throw a fit was obvious . r-67 1 that it was obvious that it would confuse the guards for herschel to throw a fit is not true . r-67 1 it is not true that it was obvious that it would confuse the guards for herschel to throw a fit . r-67 1 a review of this article came out yesterday . r-67 1 a review came out yesterday of this article . r-67 1 a review seems to have come out yesterday of this article . r-67 1 why do n't you pick some review up of this article ? r-67 1 ann is going to send a picture of chairman mao to her teacher , as soon as she gets home . r-67 1 ann is going to send a picture to her teacher of chairman mao , as soon as she gets home . r-67 0 * which picture is ann going to send to her teacher of chairman mao as soon as she gets home ? r-67 0 * who is ann going to send a picture to her teacher of , as soon as she gets home ? r-67 1 that a review came out yesterday of this article is catastrophic . r-67 0 * that a review came out yesterday is catastrophic of this article . r-67 1 i 'll give some to my good friend from akron . 
r-67 0 * i 'll give to my good friend from akron some . r-67 1 around midnight i promised that he would be there . r-67 1 i promised that he would be there tomorrow . r-67 1 tomorrow i promised chat he would be there . r-67 1 i promised that tomorrow he would be there . r-67 1 i promised that around midnight he would be there . r-67 1 beans i do n't like . r-67 1 proud of him i 've never been . r-67 1 beans i do n't think you 'll be able to convince me harry has ever tasted in his life . r-67 1 it was tomorrow that i promised that he would be there . r-67 1 it is beans that i do n't like . r-67 1 do you think that he sometimes went there alone ? r-67 1 i wo n't ask you to believe that he tried to force me to give her some money . r-67 1 that he sometimes went there alone is certain . r-67 1 do you believe that somebody was looking for something ? r-67 1 i never met that man who somebody tried to kill . r-67 1 i wo n't have any money . r-67 0 * i will ask you to believe that he tried to force me to give her any money . r-67 1 do you think that he ever went there alone ? r-67 1 that he ever went there alone is odd . r-67 0 that he ever went there alone is certain . r-67 0 * i never met that man who anybody tried to kill . r-67 1 tom told somebody that he was n't sick . r-67 1 buffy could n't do 100 push ups and somebody laughed . r-67 0 * tom told anybody that he was n't sick . r-67 0 * buffy could n't do 100 push ups and anybody laughed . r-67 1 i believe that the sun was out . r-67 1 i believed that the sun was out . r-67 1 i believed that the sun is out . r-67 0 * that the sun is out was obvious . r-67 1 that i believed that the sun was out is obvious . r-67 1 that i believed that the sun is out is obvious . r-67 1 that jack sometimes slept is impossible . r-67 1 that jack ever slept is impossible . r-67 0 * that jack ever slept is possible . r-67 0 * i talked to winston about winston . r-67 0 * i talked to winston about him . 
r-67 1 that the sun was out was obvious . r-67 1 john scratched his arm and mary did too . r-67 1 mary scratched her arm too . r-67 1 mary scratched john 's arm too . r-67 1 john scratched his arm and the boy who knew mary scratched her arm . r-67 1 john scratched his arm and the boy who mary knew did so too . r-67 1 that the fuzz wanted him worried john but it did n't worry mary . r-67 1 that the fuzz wanted him worried john , but that the fuzz wanted john did n't worry mary . r-67 1 that the police wanted him worried johns but it did n't worry the boy who mary knew . r-67 0 * john is prouder of having gone than nobody expected me to believe he would be . r-67 0 * john is prouder of having gone than john did n't expect me to believe he would be . r-67 0 * john is prouder of having gone than john expected nobody to believe he would be . r-67 0 * john is prouder of having gone than john expected me not to believe he would be . r-67 0 * john is prouder of having gone than than john expected me to believe not all . r-67 0 * john is prouder of having gone than john expected me to believe that he was n't . r-67 1 john is prouder of having gone than people who do n't know him would expect me to believe he would be . r-67 1 john is prouder of having gone than sally expected joan to believe that the man who did n't shave would be . r-67 1 john is prouder of having gone than i expected you to believe he would be of not having fallen asleep . r-67 1 tom knows it and dick knows it and harry knows it . r-67 1 tom washed the car , and dick waxed the car . r-67 1 tom ordered bacon , and dick ordered lettuce , and harry ordered tomatoes . r-67 1 tom , dick , and harry know it . r-67 1 tom washed , and dick waxed , and harry polished the car . r-67 1 tom , dick , and harry ate , drank , and sang . r-67 1 tom ordered bacon , and dick lettuce , and harry tomatoes . r-67 1 tom ordered bacon , and dick ordered lettuce , and i think that harry ordered tomatoes . 
r-67 0 * tom ordered bacon , and dick lettuce , and i think that harry tomatoes . r-67 1 joe is taller than mary is . r-67 1 joe is taller than mary . r-67 1 joe is taller than i think mary is . r-67 0 * joe is taller than i think mary . r-67 1 mike will sing if you will sing . r-67 1 mike will sing if you will . r-67 1 jim will go if he feels good . r-67 1 if jim feels good , he will go . r-67 1 i gave the book to harvey because he asked me to . r-67 1 it never occurred to harvey that i might want to leave because he is insensitive to other people 's desires . r-67 0 * it never occurred to harvey because he is insensitive to other people 's desires that , i might want to leave . r-67 1 . i figured it out that she was lying . r-67 1 i explained it to bill that she was lying . r-67 1 i took it for granted that she was lying . r-67 1 i regret it exceedingly that she was lying . r-67 1 he 'll bring me a hot dog if he sees one . r-67 0 * he 'll bring me one if he sees a hot dog . r-67 1 if he sees a hot dog , he 'll bring me one . r-67 1 if he sees one ; he 'll bring me a hot dog . r-67 0 * seven more soldiers came in after ten ones had left . r-67 0 * seven more ones came in after ten soldiers had left . r-67 0 * after ten soldiers had left , seven more ones came in . r-67 0 * after ten ones had left , seven more soldiers came in . r-67 1 seven more soldiers came in after ten had left . r-67 0 * seven more came in after ten soldiers had left . r-67 1 after ten had left , seven more soldiers came in . r-67 1 harry believes that sally is innocent , although no one else believes it . r-67 1 although no one else believes that sally is innocent , harry believes it . r-67 1 although no one else believes it , harry believes that sally is innocent . r-67 1 webster touched a sword after henry had done it . r-67 1 after henry had touched a sword , webster did it . r-67 1 after henry had done it , webster touched a sword . r-67 1 if so , i 've lost $ 500 . 
r-67 0 * if it , i 've lost $ 500 . r-67 1 harry thinks that sally is innocent , although no one else thinks so . r-67 0 * harry thinks so , although no one else thinks sally is innocent . r-67 1 although no one else thinks that sally is innocent , harry thinks so . r-67 1 although no one else thinks so , harry thinks that sally is innocent . r-67 1 webster touched a sword after henry had done so . r-67 1 after henry had touched a sword , webster did so . r-67 1 after henry had done so , webster touched a sword . r-67 1 i 'll work on it if i can work on it . r-67 1 i 'll work on it if no one else has worked on it . r-67 1 i 'll work on it if you do . r-67 1 i 'll work on it if no one else had . r-67 1 i 'll work on it if same will be too . r-67 0 * i will if i can work on it . r-67 1 if i can work on it , i will . r-67 1 if i can , i will work on it . r-67 1 the boy who mary loves hates her . r-67 1 the man who ordered a hot dog got one . r-67 1 tom says that it 's going to rain but i do n't believe it . r-67 1 he said he would leave and now he 's done it . r-67 1 i think that mort 's a swell guy , and lenny thinks so too . r-67 1 why ca n't the man who usually cuts the grass do so today ? r-67 1 mickey and roger have signed , and whitey will tomorrow . r-67 1 ronald scoffs at the belief that he would run if nominated . r-67 1 romeo conceded that he and juliet were going steady . r-67 1 i lost a japanese slide rule , and the fact that peter now has one i regard with suspicion . r-67 1 the earth is flat , but will all those who do n't believe it please raise their hands ? r-67 1 pilots who can fly barrel rolls say that for me to try to do it in a glider would be hazardous . r-67 1 the passengers who had known that the train was not on fire said that those who had thought so had barricaded themselves in the bathrooms . 
r-67 1 playing with matches is ; lots of fun , but doing , so and emptying gasoline from one can to another at the same time is a sport best reserved for arsons . r-67 1 swimming is fun , and i believe that people who ca n't should be taught to . r-67 1 how brave he is ! r-67 1 how surprisingly well he dances ! r-67 0 * whether he left ! r-67 0 * why he knows the answer ! r-67 0 * which boy is tall ! r-67 1 how brave everybody must think you expect me to believe he is ! r-67 0 * how brave they must believe the claim that you are ! r-67 1 how brave they must believe that you are ! r-67 0 * how brave he is tall and ! r-67 0 * how brave mike is cowardly and sam is ! r-67 1 how he is brave ! r-67 1 bill left when everyone will believe that the police have forced me to confess that i shot sandra . r-67 0 * bill left when i am looking at a girl who vomited . r-67 1 bill left when i believe the bomb had just exploded . r-67 0 * bill left when i believe the claim that the bomb had just exploded . r-67 1 when i am awake and susan is asleep , bill will leave . r-67 0 * when i am awake at that time and susan is asleep , bill will leave . r-67 0 * bill left when that no one else was awake is certain . r-67 1 bill left when it is certain that no one else was awake . r-67 1 here 's a knife for you to cut up the onions with . r-67 0 * i brought a razor to shave himself with . r-67 0 * i brought a razor to shave myself with . r-67 1 i brought a razor with which to shave myself . r-67 0 * i brought a razor with which to shave himself . r-67 1 i brought john a razor to shave himself with . r-67 0 * i brought john a razor to shave myself with . r-67 1 i brought john a razor with which to shave himself . r-67 0 * i brought john a razor with which to shave myself . r-67 0 * here 's a knife which for you to cut up the onions with . r-67 1 here 's a plate for you to make bob try to begin to force his sister to leave the cookies on . 
r-67 0 * here 's a knife for you to say was on the table . r-67 0 * here 's a pole for you to kiss the girl who tied the string around . r-67 0 * here 's a razor for you to chop up these nuts with this cleaver and . r-67 0 * here 's a razor for that you will be shaved with to be announced . r-67 0 ?? here 's a razor for it to be announced that you will be shaved with . r-67 0 * i loaned maggie a swiss army knife whose to open the padlock with corkscrew . r-67 1 fluffy is sick , which few people realize . r-67 1 fluffy is sick , which i 'm not sure you know sarah expects me to believe joan realizes . r-67 0 * fluffy is sick , which i slapped a boy who would n't acknowledge . r-67 1 fluffy is sick , which i believe that few people realize . r-67 0 * fluffy is sick , which i fell asleep and tom suddenly realized . r-67 0 * fluffy is sick , which that no one here realizes is certain . r-67 1 fluffy is sick , which it is certain that no one here realizes . r-67 1 fluffy is sick , which nobody knows . r-67 0 * fluffy is sick , as nobody knows . r-67 1 fluffy is sick , as not everybody knows . r-67 0 * fluffy is sick , as surprises me . r-67 1 it was this hat that tom said al thought you wanted me to make jack put on . r-67 1 what tom said al thought you wanted me to make jack put on was this hat . r-67 1 this hat tom said al thought you wanted me to make jack put on . r-67 0 * it is this hat that i know the boy who is wearing . r-67 1 it is this hat that i believe that he was wearing . r-67 0 * what i know the boy who was wearing is this hat . r-67 1 what i believe that he was wearing is this hat . r-67 0 * this hat i know the boy who was wearing . r-67 1 this hat i believe that he was wearing . r-67 0 * what the gloves and were on the table was this hat . r-67 0 * this hat the gloves and were on the table . r-67 0 * it is this hat that that he was wearing is certain . r-67 1 it is this hat that it is certain that he was wearing . 
r-67 0 * what that he was wearing is certain is this hat . r-67 1 what it is certain that he was wearing is this hat . r-67 1 this hat it is certain that he was wearing . r-67 0 * it was john 's that i stole bike . r-67 0 * the one whose i stole bike was john 's . r-67 0 * john 's i stole bike . r-67 1 maxwell is n't the doctor that his father was . r-67 1 maxwell is n't half the doctor that his father was . r-67 1 maxwell is the man who won the nobel prize for astrology . r-67 0 * maxwell is n't half the doctor . r-67 1 maxwell is quite the doctor . r-67 1 maxwell is n't much of a doctor . r-67 1 maxwell is more of a doctor than his son is . r-67 0 * maxwell is n't half the doctor that was here . r-67 0 * maxwell is n't half the doctor that polished off the vodka . r-67 0 * half the doctor that maxwell 's father was sat down . r-67 1 maxwell is n't half the doctor that feared marge would realize tom had confessed that he knew bill expected him to be . r-67 0 * maxwell is n't half the doctor that i know an african chief who is . r-67 1 maxwell is n't half the doctor that people around here believe that his father was . r-67 0 * maxwell is n't half the doctor that his sister is a psychologist and his father was . r-67 0 * maxwell is n't half the doctor that that he would be if he studied is certain . r-67 1 maxwell is n't half the doctor that i 'm certain that he would be if he studied . r-67 1 he 's the happiest that i 've ever seen him . r-67 1 the hardest that it ever snowed was last january 12th . r-67 1 the hardest that i think i remember him ever telling me that he had heard of it snowing around here was last january 12th . r-67 0 * he 's the happiest that we ever talked to the boy who had seen him . r-67 1 he 's the happiest that i believe that he 's ever been . r-67 0 * the hardest that i ever knew a man who said that it had snowed was last january 12th . r-67 1 the hardest that i believe that it ever snowed was last january 12th . 
r-67 0 * he 's the happiest that i 've ever seen him drunk and . r-67 0 * the hardest that all the power lines were down and it snowed was last january 12th . r-67 0 * he is the happiest that that he has ever been is believed . r-67 1 he is the happiest that it is believed that he has ever been . r-67 0 * the hardest that that it has snowed here is believed was last january 12th . r-67 1 the hardest that it is believed that it has ever snowed here was last january 12th . r-67 0 * a friend of mine and a girl who was from his home town met in vienna who was working in europe . r-67 0 * a friend of mine who was working in europe and a girl met in vienna who was from his home town . r-67 0 * it and that he loved another was painfully evident that she loved him . r-67 0 * that she loved him and it was painfully evident that he loved another . r-67 1 mary and an old friend who comes from miami kissed . r-67 0 * mary and kissed an old friend who comes from miami . r-67 1 i gave a picture of a covered bridge and a hundred hikers from hoboken to my sister . r-67 0 * i gave a picture of a covered bridge and to my sister a hundred hikers from hoboken . r-67 0 * joan plays and sings folk songs a wonderful old guitar from spain . r-67 1 sally might be pregnant , and i know a girl who definitely is pregnant . r-67 0 ?* sally might be , and i know a girl who definitely is pregnant . r-67 1 sally might be pregnant , and i believe the claim that sheila definitely is pregnant . r-67 1 sally might be pregnant , and i believe that sheila definitely is pregnant . r-67 0 ?* sally is tall , and may be , and sheila is short , and definitely is , blond . r-67 1 hank plays the guitar and finds arrangements for all the old folk songs which are still sung in these hills , and ernie writes down all the old folk songs which are still sung in these hills . r-67 0 ?? 
hank plays the guitar and finds arrangements for , and ernie writes down , all the old folk songs which are still sung in these hills . r-67 1 they said that tom would pay up and he will pay up . r-67 1 they said that tom was working , and he is working . r-67 1 they said that tom would pay up , and pay up he did . r-67 1 they said that tom would pay up , and pay up he will . r-67 1 they said that tom was working , and working he is . r-67 1 they said tom would pay up , and pay up i 'm sure everybody will tell you that his lawyers expect me to believe he did . r-67 1 they said nobody would pay up , but i know a boy who did pay up . r-67 0 * they said nobody would pay up , but pay up i know a boy who did . r-67 1 they said that tom would pay up , and pay up i believe that he did . r-67 1 they said that tom would n't pay up , but he did go to the bank , and he did pay up . r-67 1 they said that tom would n't pay up , but pay up he did go to the bank and he did . r-67 0 * they said that tom would pay up , and pay up that he did is well-known . r-67 1 they said that tom would pay up , and pay up it is well-known that he did . r-67 1 although dick is handsome , i 'm still going marry herman . r-67 1 handsome though dick is , i 'm still going to marry herman . r-67 1 handsome though everyone expects me to try to force bill to make mom agree that dick is , i 'm still going to marry herman . r-67 0 * handsome though i know several boys who are , i 'm still going to marry herman . r-67 1 handsome though i believe that dick is , i 'm still going to marry herman . r-67 0 * handsome though dick is fair , nordic , strong and , i 'm still going to marry herman . r-67 0 * handsome though that dick will be is likely , i 'm still going to marry herman . r-67 1 the more contented we pretended to be , the more we grew angry at the doctors . r-67 0 * the more contented i laughed at the nurse who thought that we were becoming , the more angry we grew at the doctors . r-67 0 ?? 
the more contented the nurses began to believe that we were going to pretend to be , the more angry we grew at the doctors . r-67 0 * the more contented we pretended to be better fed and , the more angry we grew at the doctors . r-67 0 * the more contented for us to pretend to be became possible , the more angry we grew at the doctors . r-67 1 i have some papers to grade . r-67 0 ?* i have some papers to announce that i 've got to grade . r-67 1 i have some papers to try to finish grading . r-67 1 i have getting into college to consider . r-67 0 * i have some papers to grade these exams and . r-67 0 * i have some voice exercises to play the guitar and sing . r-67 0 * i have john 's to grade paper . r-67 1 wilt is taller than i imagine anybody would ever guess that people had begun expecting red to announce that he was . r-67 1 the sofa was longer than the desk was . r-67 1 the sofa was longer than the desk was long . r-67 0 * willy is taller than i know a boy who is . r-67 1 wilt is taller than i believe that bill is . r-67 0 * willy is taller than bill is strong and . r-67 0 * dean drank more booze than frank ate wheaties and sammy drank . r-67 0 * willy is taller than that bill is is generally believed . r-67 1 wilt is taller than it is generally believed that bill is . r-67 1 willy is taller than bill by 7 millimeters . r-67 1 the raise which scrooge generously gave tom 's father increased his yearly salary by five cents . r-67 1 the hare outran the tortoise by so much that he forgot the latter was even in the race any more . r-67 1 who knew mickey would overthrow home plate by that much ? r-67 1 willy is taller than bill by that much . r-67 1 john is taller than bill by that much . r-67 1 willy is taller than bill by as much as joe is taller than the dan . r-67 1 willy is taller than bill by more than joe is taller than dan . r-67 1 willy is taller than bill by as much as everybody seems to expect me to admit to having publicly proclaimed that i believed . 
r-67 0 * willy is taller than bill by as much as i know a boy who thinks that bill is taller than dan .
r-67 1 willy is taller than bill by as much a peter believes that billy is taller than dan .
r-67 0 * willy is taller than bill by as much as i watch all the games and i know billy is taller than dan .
r-67 0 * willy is taller than bill by as much as that bill is taller than dan is believed .
r-67 1 willy is taller than bill by as much as it is believed that joe is taller than dan .
r-67 1 the rock was too heavy for me to pick up .
r-67 1 this rock is too heavy for me to begin to decide about helping bob to try to pick it up .
r-67 0 ?? this rock is too heavy for me to begin to decide about helping bob to try to pick up .
r-67 0 * this rock is too heavy for us to try to claim that we picked up .
r-67 1 sodium is a little too peppy for me to want to try mixing it and water in a teacup .
r-67 0 * sodium is a little too peppy for me to want to try mixing and water in a teacup .
r-67 0 * that piece of ice is too big for for him to be able to pick up with a teaspoon to be likely .
r-67 0 ?? that piece of ice is too big for it to be likely for him to be able to pick up with a teaspoon .
r-67 1 bob is too thin for me to be able to squeeze into his jacket .
r-67 0 * bob is too thin for me to be able to squeeze into jacket .
r-67 1 this rock is light enough for marcia to pick it up .
r-67 1 this rock is light enough for marcia to pick up .
r-67 1 the socks are ready for you to put on .
r-67 1 the socks are ready for you to go about beginning to put them on .
r-67 1 the socks are ready for you to announce that you will put them on .
r-67 0 * the socks are ready for you to announce that you will put on .
r-67 1 the socks are ready for you to try them and the shoes on .
r-67 0 * the socks are ready for you to try and the shoes on .
r-67 1 john is ready for you to inspect his bunk .
r-67 0 * john is ready for you to inspect bunk .
r-67 0 * the socks are ready for it to be planned for you to put on .
r-67 1 it is tough to play sonatas on this violin .
r-67 1 sonatas are difficult to play on this violin .
r-67 1 sonatas are easy to play on this violin .
r-67 1 sonatas are tough to play on this violin .
r-67 1 this violin is easy to play sonatas on .
r-67 1 this violin is tough to play sonatas on .
r-67 1 i made john easy to get along with .
r-67 1 i made it easy to get along with john .
r-67 1 john tries to be easy to get along with .
r-67 0 * john tried bill to play checkers .
r-67 0 * john tried for bill to play checkers .
r-67 0 * bill would be easy for for you to chat with in moscow to become expensive .
r-67 0 * bill would be easy for it to become expensive for you to chat with in moscow .
r-67 1 my father , he 's armenian , and my mother , she 's greek .
r-67 0 * if my father , he comes home late , my mother always grills him .
r-67 0 * it started to rain after jackie and me , we had finally gotten to our seats .
r-67 0 ?* i acknowledged that my father , he was tight as an owl .
r-67 1 i said that my father , he was tight as an owl .
r-67 0 * that beans he likes is now obvious .
r-67 0 * i 'm going to write to the game warden if more than one deer my neighbor brings back .
r-67 0 * i do n't know the boy who the flowers mary gave to .
r-67 0 * i do n't know the boy the flowers who mary gave to .
r-67 0 * that informers they never use is claimed by the cia .
r-67 1 my father , i hardly ever see him and my mother when they 're not glaring at each other .
r-67 1 this guitar , i 've sung folk songs and accompanied myself on it all my life .
r-67 1 my father , that he 's lived here all his life is well-known to the cops .
r-67 1 my wife , somebody stole her handbag last night .
r-67 1 they spoke to the janitor about that robbery yesterday , the cops .
r-67 1 the cops spoke to him about that robbery yesterday , the janitor .
r-67 1 the cops spoke to the janitor about it yesterday , that robbery .
r-67 1 that they spoke to the janitor about that robbery yesterday , the cops , is terrible .
r-67 1 that the cops spoke to the janitor about it yesterday , that robbery , is terrible .
r-67 0 ?* that they spoke to the janitor about that robbery yesterday is terrible , the cops .
r-67 0 * they let him go yesterday , he .
r-67 0 * they let him go yesterday , him .
r-67 0 * i like beer , i .
r-67 0 ?* i like beer , me .
r-67 0 * we 'll go together , us .
r-67 0 * they ca n't stand each other , they .
r-67 0 * they ca n't stand each other , them .
r-67 1 we 'll do it together , you and i .
r-67 1 we 'll do it together , you and me .
r-67 1 they ca n't stand each other , he and she .
r-67 1 they ca n't stand each other , him and her .
r-67 0 * he , they let him go yesterday .
r-67 1 him , they let him go yesterday .
r-67 0 * i , i like beer .
r-67 1 me , i like beer .
r-67 0 * we , we 'll go together .
r-67 1 us , we 'll go together .
r-67 0 * they , they ca n't stand each other .
r-67 1 them , they ca n't stand each other .
r-67 0 * i saw mary and downtown yesterday your friend from boston .
r-67 1 i saw mary and him downtown yesterday , your friend from boston .
r-67 0 * i noticed car in the driveway last night your friend from boston .
r-67 1 i noticed his car in the driveway last night , your friend from boston .
r-67 0 * i spoke to about the war yesterday that guy who 's always following us .
r-67 1 i spoke to him about the war yesterday , that guy who 's always following us .
r-67 1 i just saw that girl who long john 's claim that he was a martian made all the headlines .
r-67 1 all the students who the papers which they submitted were lousy i 'm not going to allow to register next term .
r-67 1 did n't that guy who the game warden and him had seen a flying saucer crack up ?
r-67 1 palmer is a guy who for for him to stay in school would be stupid .
r-67 1 king kong is a movie which you 'll laugh yourself sick if you see it .
r-67 1 enrico , who is the smartest of us all , got the answer in seven seconds .
r-67 1 enrico , and he is the smartest of us all , got the answer in seven seconds .
r-67 0 * any student , who wears socks , is a swinger .
r-67 0 * no student , who wears socks , is a swinger .
r-67 0 * every student , who wears socks , is a swinger .
r-67 0 * any student , and he wears socks , is a swinger .
r-67 0 * no student , and he wears socks , is a swinger .
r-67 1 is even clarence , who is wearing mauve socks , a swinger ?
r-67 1 seven pine trees are behind that barn .
r-67 1 there are seven pine trees behind that barn .
r-67 1 that barn has seven pine trees behind it .
r-67 1 there will be a hole in jack 's pocket .
r-67 0 * there will be the hole in jack 's pocket .
r-67 1 jack will have a hole in his pocket .
r-67 0 * that barn has seven pine trees behind itself .
r-67 0 * that barn has seven pine trees behind the cow .
r-67 1 jack 's pocket will have a hole in it .
r-67 0 ?? there is a hole in john 's quilt 's upper right-hand corner .
r-67 0 ?? john 's quilt 's upper right-hand corner has a hole .
r-67 1 john 's quilt has a hole in its upper right-hand corner .
r-67 0 ?? john has a hole in his quilt 's upper right-hand corner .
r-67 1 john has a hole in the upper right-hand corner of his quilt .
r-67 1 there are seven holes in the door and window .
r-67 0 * the door has seven holes in it and the window .
r-67 1 there is a blemish on the end of jerry 's sister 's nose .
r-67 0 * jerry has a blemish on the end of his sister 's nose .
r-67 1 jerry 's sister has a blemish on the end of her nose .
r-67 1 there is a hole in the rug which toby bought in boston .
r-67 1 there was an error in the proof johns presented .
r-67 1 there was a snake behind the car fred was sitting in .
r-67 1 john had an error in the proof he presented .
r-67 0 * john had an error in the proof sarah presented .
r-67 1 fred had a snake behind the car joe was sitting in .
r-67 1 fred had a snake behind the car he was sitting in .
r-67 1 there was a yellow collar on the dog which the car injured .
r-67 1 there was a snake behind the car the time bomb was sitting in .
r-67 0 * the car had a yellow collar on the dog which it injured .
r-67 0 * that stone has a hole in the tarpaulin which it is holding down .
r-67 0 * the time bomb had a snake behind the car which it was sitting in .
r-67 1 there were several hundred people yelling for me to put down gently .
r-67 1 the hot potato which there were several hundred people yelling for me to put down gently turned out to have been filled with tnt .
r-67 0 * the hot potato had several hundred people yelling for me to put it down gently .
r-67 1 bartlett and toni danced .
r-67 1 bartlett danced with toni .
r-67 0 * bartlett and danced toni .
r-67 0 * and toni danced bartlett .
r-67 1 it bothers me for her to wear that old fedora .
r-67 0 * the only girl for whom it bothers me to wear that old fedora is annabelle .
r-67 0 * the only girl who it bothers me to wear that old fedora is annabelle .
r-67 1 i would prefer it for there to be no talking .
r-67 1 he gave my binoculars to that girl .
r-67 1 he gave that girl my binoculars .
r-67 1 which girl did he give my binoculars to ?
r-67 0 * which girl did he give my binoculars ?
r-67 1 my binoculars were given to that girl by him .
r-67 1 bill confirmed that roger has eaten the cake .
r-67 1 bill alleged that roger had eaten the cake .
r-67 1 bill alleged that roger has eaten the cake .
r-67 0 ?? what did bill confirm that roger had eaten ?
r-67 1 what did bill allege that roger had eaten ?
r-67 0 ?* bill did n't confirm that roger had eaten anything .
r-67 0 * waldo did n't report the possibility that anyone had left .
r-67 1 waldo did n't report that anyone had left .
r-67 1 anybody who ever swears at me better watch his step .
r-67 1 i want all the students who have ever tried to pat fido to show me their scars .
r-67 0 * only the travelers who anybody has ever robbed do n't carry machetes .
r-67 1 the only travelers who anybody has ever robbed do n't carry machetes .
r-67 0 * i ca n't remember the name of somebody who had any misgivings .
r-67 1 i ca n't remember the name of anybody who had any misgivings .
r-67 1 everybody who has ever , worked in any office which contained any typewriter which had ever been used to type any letters which had to be signed by any administrator who ever worked in any department like mine will know what i mean .
r-67 1 no student who ever goes to europe ever has enough money .
r-67 0 * every student who ever goes to europe ever has enough money .
r-67 0 * i did n't eat the ice cream and any cake .
r-67 1 i realized that it had rained and some crops had been destroyed .
r-67 1 i did n't realize that it had rained and some crops had been destroyed .
r-67 1 i did n't eat any ice cream or any cake .
r-67 0 * i did n't eat any ice cream and any cake .
r-67 0 ?* i did n't eat the cake or any ice cream .
r-67 0 * i did n't eat any ice cream or mary 's cake .
r-67 0 * i did n't eat any ice cream or the cake .
r-67 1 john and mary met in vienna .
r-67 1 john met mary in vienna .
r-67 0 * few writers and any playwrights meet in vienna .
r-67 1 few writers meet any playwrights in vienna .
r-67 0 * my brother and few americans meet in vienna .
r-67 1 my brother meets few americans in vienna .
r-67 1 no writer , and no playwright , speaks clearly .
r-67 1 no writer , nor any playwright , speaks clearly .
r-67 0 * bill understands mary and himself .
r-67 0 ?* bill understands himself and mary .
r-67 0 * bill and mary washed himself .
r-67 0 * andy pinched sarah and tickled herself .
r-67 0 * the gun and a description of itself lay on the bureau .
r-67 1 bill believes that anna and he are similar .
r-67 1 bill believes anna and him to be similar .
r-67 0 * bill believes anna and himself to be similar .
r-67 0 * i deny that that bob has any money is certain .
r-67 1 i deny that it is certain that bob has any money .
r-67 0 ?? i deny that that bob has some money is certain .
r-67 1 tom will not force you to marry any student .
r-67 1 tom will force you to marry no student .
r-67 0 * the writers of any of the reports did n't know the answer .
r-67 1 the writers of none of the reports .
r-67 1 tom will force you to marry no student , and neither will i .
r-67 1 it is not certain that you 'll marry any student .
r-67 1 it is not certain that you 'll marry any particular student .
r-67 1 it is certain that you 'll marry no student .
r-67 1 that you will marry any particular student is not certain .
r-67 1 the man who i gave john a picture of was bald .
r-67 0 ?? the man who i gave john this picture of was bald .
r-67 0 * the man who i gave john ed 's picture of was bald .
r-67 1 i gave jack a picture of myself .
r-67 0 * i gave jack ed 's picture of myself .
r-67 1 i did n't give jack a picture of anybody .
r-67 0 * i did n't give jack this picture of anybody .
r-67 1 i hope i 'm not treading on anyone 's toes .
r-67 1 abernathy admitted that the poison pen letter had been written by my sister and him .
r-67 1 abernathy admitted that the poison pen letter had been written by my sister and himself .
r-67 1 that the sun was out is obvious .
r-67 1 that anybody ever left at all is not known .
r-67 1 that anybody ever left at all is not certain .
r-67 1 that anybody ever left at all is impossible .
r-67 1 that anybody ever left at all is surprises me .
r-67 1 tonight , what bob cooked yesterday still tastes good .
r-67 1 tonight , what bob cooked yesterday still tastes good , so tonight , what bob cooked yesterday will be eaten up .
r-67 1 tonight , what bob cooked yesterday still tastes good , so tonight it will be eaten up .
rhl07 1 martha gave myrna an apple .
rhl07 1 leigh threw the ball to lane .
rhl07 1 leigh threw lane the ball .
rhl07 1 the noise gave terry a headache .
rhl07 0 * the noise gave a headache to terry .
rhl07 1 jill threw the ball from home plate to third base .
rhl07 1 jill kicked the ball from home plate to third base .
rhl07 1 i sent the bicycle from my house at the beach to my house in the mountains .
rhl07 1 i shipped the bicycle from my house at the beach to my house in the mountains .
rhl07 1 fred threw the ball under the porch .
rhl07 1 fred threw the ball behind the tree .
rhl07 1 fred threw the ball over the fence .
rhl07 1 fred kicked the ball under the porch .
rhl07 1 fred kicked the ball behind the tree .
rhl07 1 fred kicked the ball over the fence .
rhl07 1 felicia threw the ball off the bench .
rhl07 1 felicia threw the ball out the window .
rhl07 1 felicia kicked the ball out the window .
rhl07 0 * felicia sent the box off the shelf .
rhl07 0 * felicia sent the box out of the storeroom .
rhl07 0 * felicia shipped the box off the shelf .
rhl07 0 * felicia shipped the box out of the storeroom .
rhl07 0 * jake sent the box at carson .
rhl07 0 * jake sent the box towards carson .
rhl07 0 * jake shipped the box at carson .
rhl07 0 * jake shipped the box towards carson .
rhl07 1 anne is curious as to why her father sent her a telegram to america to return home at once .
rhl07 0 * where did you give the ball ?
rhl07 1 where did you throw the ball ? to third base .
rhl07 1 where did you send the bicycle ? to rome .
rhl07 1 i gave the package to maria .
rhl07 0 * i gave the package to london .
rhl07 1 i sent the package to london .
rhl07 1 i threw the ball to maria .
rhl07 1 i threw the ball to the other side of the field .
rhl07 0 * susan gave the ball halfway to bill .
rhl07 0 * susan gave the ball all the way to bill .
rhl07 1 jake threw the ball all the way to bill .
rhl07 1 jake threw the ball halfway to bill .
rhl07 1 jake kicked the ball all the way to bill .
rhl07 1 jake kicked the ball halfway to bill .
rhl07 1 i sent the package all the way around the world .
rhl07 1 i sent the package to the antarctic .
rhl07 1 i shipped the package halfway around the world .
rhl07 1 i shipped the package all the way around the world .
rhl07 1 i shipped the package halfway to the antarctic .
rhl07 0 * fred gave the ball under molly .
rhl07 0 * fred gave the ball behind molly .
rhl07 0 * fred gave the ball over molly .
rhl07 0 * fred offered the ball under molly .
rhl07 0 * fred offered the ball over molly .
rhl07 0 * sam gave the ball off the shelf .
rhl07 0 * sam offered the ball off the shelf .
rhl07 0 * jill gave the ball at bob .
rhl07 0 * jill gave the ball towards bob .
rhl07 0 * jill offered the ball at bob .
rhl07 0 * jill offered the ball towards bob .
rhl07 1 give a fresh coat of paint to the front door .
rhl07 1 one of the jewish children is a spunky girl , who gave a black eye to the kid with the german roots before the start of the war .
rhl07 1 the door has a fresh coat of paint .
rhl07 1 the spunky girl has a black eye .
rhl07 1 i promise a good time to all who come .
rhl07 1 all who come will have a good time .
rhl07 1 he died from exhaustion .
rhl07 1 the water melted into ice .
rhl07 1 the water melted to ice .
rhl07 1 a hefty sum of money came to him from his grandfather .
rhl07 1 the close brush with the law put the fear of god in him .
rhl07 1 she fell in love .
rhl07 1 she fell into a sulk .
rhl07 1 she fell into a funk .
rhl07 1 to whom did you give the ball ?
rhl07 1 to whom did you throw the ball ?
rhl07 1 where did you throw the ball ?
rhl07 1 to whom did you send the package ?
rhl07 1 where did you send the package ?
rhl07 1 smith threw the ball to the first baseman .
rhl07 1 smith threw the first baseman the ball .
rhl07 0 * smith threw the first base the ball .
rhl07 1 smith envied jones his good fortune .
rhl07 0 * smith envied his good fortune to jones .
rhl07 1 no one can forgive you that comment .
rhl07 1 the recession cost my grandfather a raise .
rhl07 1 mary taught john linguistics .
rhl07 1 mary taught linguistics to john .
rhl07 1 i threw the ball to julian , but it fell short of him .
rhl07 1 max offered the victims help , but they refused his offer .
rhl07 1 max offered help to the victims , but they refused his offer .
rhl07 1 sarah promised her old car to catherine , but then gave it to her son instead .
rhl07 1 i taught them english for an entire year , but they do n't seem to have learned .
rhl07 1 i read him the figures , but when i looked up , he was gone .
rhl07 1 i throw you a lifeline and you giggle .
rhl07 1 i kicked him the ball , but the wind blew it astray .
rhl07 1 i threw mary the ball , but she was looking at the birds flying overhead and did n't even notice .
rhl07 1 i threw the ball to mary , but she was looking at the birds flying overhead and did n't even notice .
rhl07 1 lewis shipped sam a bicycle , but it never arrived .
rhl07 1 lewis sent sam a bicycle , but it never arrived .
rhl07 1 the police read the detainees ' rights to them , but not a single one was paying attention .
rhl07 1 i wrote a letter to blair , but i tore it up before i sent it .
rhl07 1 the police read the detainees their rights , but not a single one was paying attention .
rhl07 1 i wrote blair a letter , but i tore it up before i sent it .
rhl07 1 ann copied the manuscript , but she did n't finish it .
rhl07 1 alex read the newspaper for an hour .
rhl07 1 alex read the newspaper in an hour .
rhl07 0 * i lent the book halfway to tony .
rhl07 0 * i lent the book all the way to tony .
rhl07 0 * i lent the book most of the way to tony .
rhl07 0 * i lent tony the book partway .
rhl07 0 * i lent tony the book halfway .
rhl07 0 * i lent tony the book all the way .
rhl07 0 * i lent tony the book most of the way .
rhl07 0 * robin arrived partway at the station .
rhl07 0 * robin arrived all the way at the station .
rhl07 0 * robin arrived most of the way at the station .
rhl07 0 * the old dog died partway .
rhl07 0 * the old dog died halfway .
rhl07 0 * the old dog died all the way .
rhl07 1 sandy taught the children the alphabet , but only got as far as the letter `` r '' .
rhl07 1 maxine read the children goodnight moon , but they fell asleep before she got to the end .
rhl07 1 interviewing richard nixon gave norman mailer a book .
rhl07 1 nixon 's behavior gave mailer an idea for a book .
rhl07 1 nixon 's behavior gave an idea for a book to every journalist living in new york city in the 1970s .
rhl07 1 we gave a fresh coat of paint to the house .
rhl07 1 the five `` soundscape '' pieces gave a festive air to park square .
rhl07 1 gordie gillespie still can give a piece of his mind to the umps .
rhl07 1 i sent the salesman to the devil .
rhl07 0 * i sent the devil the salesman .
rhl07 1 nixon 's behavior gave an idea for a book to every journalist living in new york .
rhl07 1 the music lent a festive air to the party .
rhl07 1 it is very difficult to get an idea for a book simply from an interview .
rhl07 1 it is unreadable , guaranteed to give a headache to anyone who looks hard at the small print .
rhl07 1 `` doing my taxes '' gives a headache to 22 percent of americans surveyed for pfizer , which makes tylenol pain relief medicine .
rhl07 1 lopez says that he has done more than simply give a fresh coat of paint to the site .
rhl07 1 i think it 's time you give your lovely illness to someone else !
l-93 1 sharon sprayed the plants with water .
l-93 1 the farmer loaded apples into the cart .
l-93 0 * monica covered a blanket over the baby .
l-93 1 monica covered the baby with a blanket .
l-93 1 carla poured lemonade into the pitcher .
l-93 0 * carla poured the pitcher with lemonade .
l-93 1 the farmer dumped apples into the cart .
l-93 1 the window broke .
l-93 1 the little boy broke the window .
l-93 1 a rabbit appeared out of the magician 's hat .
l-93 0 * the magician appeared a rabbit out of his hat .
l-93 1 martha carved a toy out of wood for the baby .
l-93 1 martha carved some wood into a toy for the baby .
l-93 0 * martha carved the baby some wood into a toy .
l-93 1 margaret cut the bread .
l-93 1 janet broke the vase .
l-93 1 terry touched the cat .
l-93 1 carla hit the door .
l-93 1 crystal vases break easily .
l-93 0 * cats touch easily .
l-93 0 * door frames hit easily .
l-93 1 margaret cut at the bread .
l-93 0 * janet broke at the vase .
l-93 0 * terry touched at the cat .
l-93 1 carla hit at the door .
l-93 1 margaret cut bill 's arm .
l-93 1 margaret cut bill on the arm .
l-93 1 janet broke bill 's finger .
l-93 1 terry touched bill 's shoulder .
l-93 1 terry touched bill on the shoulder .
l-93 1 carla hit bill 's back .
l-93 1 carla hit bill on the back .
l-93 1 jean moved the table .
l-93 0 * jean moved at the table .
l-93 1 margaret cut the string .
l-93 0 * the string cut .
l-93 0 * the cat touched .
l-93 0 * the door hit .
l-93 1 a the butcher cuts the meat .
l-93 1 the meat cuts easily .
l-93 1 janet broke the crystal .
l-93 1 crystal breaks at the slightest touch .
l-93 1 kelly adores french fabrics .
l-93 0 * french fabrics adore easily .
l-93 1 joan knew the answer .
l-93 0 * the answer knows easily .
l-93 1 bill pounded the metal .
l-93 1 bill pounded the metal fiat .
l-93 1 this metal wo n't pound flat .
l-93 1 the cup broke .
l-93 1 they gave the bicycle to me .
l-93 0 * the bicycle gave to me .
l-93 0 * the bread cut .
l-93 0 * the magician appeared a dove from his sleeve .
l-93 1 sylvia jumped the horse over the fence .
l-93 1 the horse jumped over the fence .
l-93 1 the scientist ran the rats through the maze .
l-93 1 the rats ran through the maze .
l-93 1 the bell rang .
l-93 1 they stood the statue on the pedestal .
l-93 1 the statue stood on the pedestal .
l-93 1 the army lodged the soldiers in the schoolhouse .
l-93 1 heat radiates from the sun .
l-93 1 the sun radiates heat .
l-93 1 the departing passenger waved at the crowd .
l-93 0 * jennifer craned .
l-93 1 i shaved my face .
l-93 1 i shaved .
l-93 1 celia braided her hair .
l-93 0 * celia braided .
l-93 0 * tessa sprained .
l-93 1 jill dressed hurriedly .
l-93 0 * i shaved myself .
l-93 0 * celia brushed .
l-93 1 tessa cut herself .
l-93 0 * tessa cut .
l-93 1 we loaded ourselves onto the bus .
l-93 1 we loaded onto the bus .
l-93 1 we pulled ourselves free .
l-93 1 anne met cathy .
l-93 1 anne and cathy met .
l-93 0 * brenda chatted molly .
l-93 1 brenda and molly chatted .
l-93 1 the drunk hugged the lamppost .
l-93 0 * the drunk and the lamppost hugged .
l-93 1 italy touches france .
l-93 1 italy and france touch .
l-93 0 * ellen argued helen .
l-93 1 ellen and helen argued .
l-93 1 the sign warned us against skating on the pond .
l-93 1 the sign warned against skating on the pond .
l-93 1 for discussion of the same phenomenon in russian .
l-93 1 that dog bites people .
l-93 1 that dog bites .
l-93 1 i cut the bread with this knife .
l-93 1 this knife cut the bread .
l-93 1 this knife does n't cut .
l-93 1 these shears clip well .
l-93 1 this machine records well .
l-93 1 this oven cooks well .
l-93 1 this lotion softens , soothes , and protects .
l-93 1 this polish cleans , protects , and shines .
l-93 0 * this key wo n't open .
l-93 1 this key wo n't open the jock .
l-93 0 * this hammer wo n't break .
l-93 1 this hammer wo n't break the window .
l-93 1 they pushed their way through the crowd .
l-93 1 they pushed through the crowd .
l-93 1 bake for 30 minutes .
l-93 0 * like the ice cream .
l-93 0 * like after tasting .
l-93 1 paula hit the fence .
l-93 1 paula hit at the fence .
l-93 1 faustina sprayed the lilies .
l-93 0 * janet broke at the bread .
l-93 1 i pushed the table .
l-93 1 i pushed at the table .
l-93 1 i pushed on the table .
l-93 1 i pushed against the table .
l-93 1 the mouse nibbled the cheese .
l-93 1 the mouse nibbled at the cheese .
l-93 1 the mouse nibbled on the cheese .
l-93 0 * monica moved at the cat .
l-93 1 martha climbed up the mountain .
l-93 1 martha climbed the mountain .
l-93 1 they skated along the canals .
l-93 1 they skated the canals .
l-93 1 the spaceship revolves around the earth .
l-93 0 * the spaceship revolves the earth .
l-93 1 martha slowly descended the stairs .
l-93 1 jill met sarah .
l-93 1 jill embraced sarah .
l-93 1 bill sold tom a car .
l-93 1 bill sent a package london .
l-93 1 bill sent tom a package .
l-93 0 * bill sent london a package .
l-93 1 martha carved a toy for the baby .
l-93 1 martha carved the baby a toy .
l-93 1 the architect selected a house for the couple .
l-93 1 jack sprayed paint on the wall .
l-93 1 jack sprayed the wall with paint .
l-93 0 * june covered the blanket over the baby .
l-93 1 june covered the baby with a blanket .
l-93 0 * tamara poured the bowl with water .
l-93 1 henry cleared dishes from the table .
l-93 1 henry cleared the table of dishes .
l-93 1 the thief stole the painting from the museum .
l-93 0 * the thief stole the museum of the painting .
l-93 1 the doctor cured pat of pneumonia .
l-93 1 helen wiped the wall .
l-93 0 * helen wiped the wall of fingerprints .
l-93 1 bees are swarming in the garden .
l-93 1 the garden is swarming with bees .
l-93 0 * people are seething in the square .
l-93 0 * the pasture is herding with cattle .
l-93 1 clouds cleared from the sky .
l-93 1 the sky cleared .
l-93 1 martha carved the piece of wood into a toy .
l-93 1 david constructed a house out of .
l-93 1 david constructed a house from bricks .
l-93 0 * david constructed the bricks into a house .
l-93 1 i whipped the eggs into a froth .
l-93 1 the witch turned him into a frog .
l-93 0 * the witch turned him from a prince .
l-93 1 an oak tree will grow from that acorn .
l-93 1 the witch turned him from a prince into a frog .
l-93 0 * martha carved the piece of wood from a branch into a toy .
l-93 0 * i whipped the eggs from a puddle into a froth .
l-93 1 he turned from a prince into a frog .
l-93 0 * that acorn will grow from a seed into an oak tree .
l-93 1 the car collided with the fence .
l-93 1 i separated the yolk from the white .
l-93 1 i separated the yolk and the white .
l-93 1 i mixed the sugar and the butter .
l-93 1 i confused maria with anna .
l-93 1 i confused maria and anna .
l-93 1 linda taped the label to the cover .
l-93 0 * linda taped the label and the cover .
l-93 1 harriet alternated folk songs with pop songs .
l-93 0 * harriet alternated folk songs and pop songs together .
l-93 1 i broke the twig and the branch apart .
l-93 0 * i detached the handle and the box apart .
l-93 1 brenda agreed with molly .
l-93 1 brenda and molly agreed .
l-93 1 the oil separated from the vinegar .
l-93 1 the oil and vinegar separated .
l-93 0 * bill married with kathy .
l-93 1 bill and kathy married .
l-93 1 the twig broke off of the branch .
l-93 0 * the twig and the branch broke .
l-93 1 the eggs and the cream mixed together .
l-93 0 * plays and ballets alternate together .
l-93 1 the twig and the branch broke apart .
l-93 0 * the yolk and the white separated apart .
l-93 1 the judge presented the winner with a prize .
l-93 1 the judge offered a prize to the winner .
l-93 0 * the judge offered the winner with a prize .
l-93 0 * the judge saddled a prize to the winner .
l-93 1 the judge saddled the winner with a prize .
l-93 1 the jeweller inscribed the name on the ring .
l-93 1 the jeweller inscribed the ring with the name .
l-93 1 the jeweller copied the name on the ring .
l-93 0 * the jeweller copied the ring with the name .
l-93 0 * the jeweller decorated the name on the ring .
l-93 1 the jeweller decorated the ring with the name .
l-93 1 brian hit the fence with the stick .
l-93 1 don swatted the mosquito with the newspaper .
l-93 1 alison pierced the cloth with a needle .
l-93 1 paula hit the fence with the stick .
l-93 1 mira blamed the accident on terry .
l-93 1 mira blamed terry for the accident .
l-93 0 * mira condemned the accident on terry .
l-93 1 ida hunted the woods for deer .
l-93 1 ida hunted for deer in the woods .
l-93 1 ida hunted deer in the woods .
l-93 1 melissa searched the papers for a clue .
l-93 1 melissa searched for a clue in the papers .
l-93 0 * melissa searched a clue in the papers .
l-93 1 i stalked the woods for game .
l-93 0 * i stalked for game in the woods .
l-93 1 i stalked game in the woods .
l-93 0 * we investigated for bombs in the area .
l-93 0 * we investigated bombs in the area .
l-93 0 * we rummaged the desk for papers .
l-93 1 we rummaged through the desk for papers .
l-93 0 * we rummaged papers through the desk .
l-93 0 * i sought the woods for game .
l-93 0 * i sought for game in the woods .
l-93 1 selina touched the horse on the back .
l-93 1 selina touched the horse 's back .
l-93 1 the horse kicked penny in the shin .
l-93 1 the horse kicked penny 's shin .
l-93 1 alison poked daisy in the ribs .
l-93 1 alison poked daisy 's ribs .
l-93 0 * the horse broke penny in the shin .
l-93 1 the horse broke penny 's shin .
l-93 0 * the glass cut rachel in the toe .
l-93 1 the glass cut rachel 's toe .
l-93 1 they praised the volunteers ' dedication .
l-93 1 they praised the volunteers for their dedication .
l-93 1 i admired him for his courage .
l-93 1 the inspector analyzed the building 's soundness .
l-93 1 the inspector analyzed the building for its soundness .
l-93 0 * i sensed him for his eagerness .
l-93 1 i admired his honesty .
l-93 1 i admired the honesty in him .
l-93 1 i admired him for his honesty .
l-93 1 i sensed the eagerness in him .
l-93 0 * they praise the dedication in the volunteers .
l-93 1 mark terrified me with his single mindedness .
l-93 1 mark 's single mindedness terrified me .
l-93 1 the clown amused the children with his antics .
l-93 1 the clown 's antics amused the children .
l-93 1 meat fell in price .
l-93 1 the price of meat fell .
l-93 1 the president appointed smith press secretary .
l-93 1 the president appointed smith as press secretary .
l-93 0 * angela characterized shelly a lifesaver .
l-93 1 angela characterized shelly as a lifesaver .
l-93 0 * the captain named the ship as seafarer .
l-93 1 the world saw the beginning of a new era in 1492 .
l-93 1 1492 saw the beginning of a new era .
l-93 1 i dried the clothes in the sun .
l-93 1 the sun dried the clothes .
l-93 1 david broke the window with a hammer .
l-93 0 * the spoon ate the ice cream .
l-93 1 the crane loaded the truck .
l-93 0 * the pitchfork loaded the truck .
l-93 1 he established his innocence with the letter .
l-93 1 the letter established his innocence .
l-93 1 i filled the pail with water .
l-93 1 water filled the pail .
l-93 1 we sleep five people in each room .
l-93 1 each room sleeps five people .
l-93 1 i incorporated the new results into the paper .
l-93 1 the paper incorporates the new results .
l-93 1 that whole wheat flour bakes wonderful bread .
l-93 0 * those new bricks constructed a house .
l-93 1 i bought you a ticket for $ 5 .
l-93 1 $ 5 will buy a ticket .
l-93 1 $ 5 will buy you a ticket .
l-93 1 the contractor will build a house for $ 100,000 .
l-93 1 the contractor will build you a house for $ 100,000 .
l-93 1 $ 100,000 will build you a house .
l-93 1 $ 100,000 will build a house .
l-93 1 the middle class will benefit from the new tax laws .
l-93 1 the new tax laws will benefit the middle class .
l-93 1 the middle class will gain from the new tax laws .
l-93 0 * the new tax jaws will gain the middle class .
l-93 1 the butcher cuts the meat .
l-93 1 the butler polished the silver .
l-93 1 this silver polishes itself .
l-93 1 the audience watched the movie .
l-93 0 * this movie just watches itself .
l-93 1 this window just opens itself .
l-93 1 the heat melted the ice cream .
l-93 0 * this ice cream just melts itself .
l-93 1 this book just sells itself .
l-93 1 i presented a solution to the problem yesterday .
l-93 1 a solution to the problem presented itself yesterday .
l-93 1 the cook sliced the mushrooms .
l-93 1 the mushrooms were sliced by the cook .
l-93 1 columbus believed the earth to be round .
l-93 1 columbus believed that the earth was round .
l-93 1 it was believed that the earth was round .
l-93 1 the police kept tabs on the suspect .
l-93 1 tabs were kept on the suspect .
l-93 1 the lax supervision was taken advantage of .
l-93 1 this bed was slept in by george washington .
l-93 0 * tuesday was slept on by george washington .
l-93 0 * the horizon was appeared on by a pirate ship .
l-93 1 the pillow remained stuffed with feathers .
l-93 1 a flowering plant is on the windowsill .
l-93 1 there is a flowering plant on the windowsill .
l-93 1 a problem developed .
l-93 1 there developed a problem .
l-93 1 a ship appeared on the horizon .
l-93 1 there appeared a ship on the horizon .
l-93 0 * there appeared the ship on the horizon .
l-93 1 a little boy darted into the room .
l-93 1 there darted into the room a little boy .
l-93 1 a little boy ran in the yard .
l-93 0 * there ran a little boy in the yard .
l-93 1 an ancient treasure trove was found in this cave .
l-93 1 there was found in this cave an ancient treasure trove .
l-93 1 suddenly an ugly old man entered the hall .
l-93 1 suddenly there entered the hall an ugly old man .
l-93 1 a lot of snow melted on the streets of chicago .
l-93 0 * there melted a lot of snow on the streets of chicago .
l-93 1 on the windowsill is a flowering plant .
l-93 1 in the woods lives an old woman .
l-93 1 a cat jumped onto the table .
l-93 1 a cat jumped on the table .
l-93 0 * on the table jumped a cat .
l-93 1 a choir sang in the church .
l-93 1 in the church sang a choir .
l-93 1 in this cave was found an ancient treasure trove .
l-93 1 a violent demonstration took place in the main square .
l-93 0 * on the streets of chicago melted a lot of snow .
l-93 1 sarah smiled .
l-93 1 sarah sang .
l-93 1 sarah sang a song .
l-93 1 sarah sang a ballad .
l-93 1 sarah sang an aria .
l-93 1 sarah sang a hymn .
l-93 1 sarah sang the anthem .
l-93 1 heather snorted .
l-93 1 kelly buttered the bread .
l-93 0 * kelly buttered the bread with butter .
l-93 1 kelly buttered the bread with unsalted butter .
l-93 1 linda taped the box with two-sided tape .
l-93 1 the men were able to mine more gold .
l-93 0 * lydia pocketed the change in her pocket .
l-93 0 * the cook boned the fish of bones .
l-93 0 * the cook boned the fish of its backbone .
l-93 1 pauline smiled her thanks .
l-93 1 sandra beamed .
l-93 0 * a cheerful welcome was beamed by sandra .
l-93 1 she mumbled .
l-93 1 she mumbled her adoration .
l-93 1 they shopped their way around new york .
l-93 1 he worked his way through the book .
l-93 1 she stipulated her way out of the problem .
l-93 1 the boy pushed his way through the crowd .
l-93 1 the explorers cut their way through the jungle .
l-93 0 * the children came their way to the party .
l-93 0 * the flower bloomed its way to a prize .
l-93 0 * they disappeared their way off the stage .
l-93 1 the silversmith pounded the metal flat .
l-93 0 * the silversmith pounded on the metal flat .
l-93 1 pauline hammered the metal flat .
l-93 1 jasmine pushed the door open .
l-93 1 the guests drank the teapot dry .
l-93 1 amanda burned the stove black .
l-93 1 belinda walked the soles off her shoes .
l-93 1 philippa cried herself to sleep .
l-93 1 the river froze solid .
l-93 1 the door slid shut .
l-93 1 the metal was hammered flat .
l-93 1 the door was pushed open .
l-93 1 philippa cried her eyes dry .
l-93 0 * the dog smelled the flower bed bare .
l-93 0 * the teacher hated the pupils angry .
l-93 0 * willa arrived breathless .
l-93 0 * sharon brought willa breathless .
l-93 0 * this list includes my name on itself .
l-93 1 fanny pulled the blanket over herself .
l-93 1 fanny pulled the blanket over her .
l-93 1 the truck rumbled .
l-93 1 the truck rumbled into the driveway .
l-93 1 audrey tiptoed to the door .
l-93 1 the couple waltzed to the window .
l-93 1 the clown wobbled down the hall .
l-93 1 leona pushed the cart to the market .
l-93 1 it is rumored that he left town .
l-93 0 * they rumor that he left town .
l-93 0 * the politician perjured his aide .
l-93 1 jennifer craned her neck .
l-93 0 * jennifer craned his neck .
l-93 0 ?* jennifer craned her arm .
l-93 1 they 've got it made .
l-93 1 the teacher meant well .
l-93 0 * the teacher meant .
l-93 1 the horse would n't budge .
l-93 1 would the horse budge if you pushed ?
l-93 0 * the horse budged .
l-93 0 * i put the book to sally .
l-93 0 * i put the book from edna .
l-93 0 * i put the book from edna to sally .
l-93 1 i put books on the table .
l-93 0 * i put the table with the books .
l-93 0 * i put the table with books .
l-93 1 i put the books on the table .
l-93 0 * the books put on the table easily .
l-93 0 * the books put on the table .
l-93 0 * i put on the table .
l-93 1 cheryl stood the books next to the magazines .
l-93 1 cheryl stood the books on the shelf .
l-93 0 * cheryl stood the books from edna .
l-93 0 * cheryl stood the books from edna to sarah .
l-93 0 * cheryl stood the shelf with books .
l-93 0 * cheryl stood the shelf with the books .
l-93 1 cheryl stood the tall books on the table .
l-93 0 * tall books stand on tables easily .
l-93 1 cheryl stood the books on the table .
l-93 1 the books stood on the table .
l-93 0 * cheryl stood on the table .
l-93 1 i funneled the mixture into the bottle .
l-93 0 * i funneled the mixture to rina .
l-93 0 * i funneled the mixture from edna to rina .
l-93 0 * i funneled the bottle with the mixture .
l-93 0 * the mixture funnels easily .
l-93 0 * the mixture funnels .
l-93 0 * i funneled the mixture .
l-93 0 * i funneled into the bottle .
l-93 1 i lifted the books .
l-93 1 i lifted the book onto the table .
l-93 1 i lifted the book onto the out of the box .
l-93 1 i lifted the books from the floor to the table .
l-93 1 i lifted the books onto the table .
l-93 0 * i lifted the table with the books .
l-93 1 i lifted the books to him .
l-93 1 i lifted the books up to him .
l-93 0 * i lifted him up the books .
l-93 0 * i lifted onto the table .
l-93 1 tamara poured water into the bowl .
l-93 1 tamara poured water over the flowers .
l-93 1 tamara poured water out of the pitcher .
l-93 0 * tamara poured at water into the bowl .
l-93 1 tamara poured water onto the plants .
l-93 0 * water pours easily onto the plants .
l-93 1 water poured onto the plants .
l-93 1 cora coiled the rope around the post .
l-93 0 * cora coiled the post with the rope .
l-93 0 * cora coiled at the rope around the post .
l-93 1 the rope coiled around the post .
l-93 1 that kind of rope coils easily around the post .
l-93 0 * cora coiled around the post .
l-93 1 jessica loaded boxes onto the wagon .
l-93 1 jessica loaded boxes into the wagon .
l-93 1 jessica sprayed paint onto the table .
l-93 1 jessica sprayed paint under the table .
l-93 1 jessica sprayed paint over the table .
l-93 1 jessica sprayed paint on the wall .
l-93 1 paint sprayed on the wall .
l-93 1 jessica sprayed the wall with paint .
l-93 0 * the wall sprayed with paint .
l-93 1 jessica squirted water at me .
l-93 1 jessica sprayed water at me .
l-93 1 jessica splashed water at me .
l-93 0 * jessica loaded boxes at the truck .
l-93 0 * jessica stuffed boxes at the truck .
l-93 1 leslie staffed the store with employees .
l-93 0 * leslie staffed employees in the store .
l-93 0 * the store staffed with employees .
l-93 1 the employees staffed the store .
l-93 1 leigh swaddled the baby with blankets .
l-93 1 lora buttered the toast .
l-93 0 * lora buttered unsalted butter on the toast .
l-93 1 lora buttered the toast with unsalted butter .
l-93 0 * lora buttered at the toast with unsalted butter .
l-93 0 * the toast buttered with unsalted butter .
l-93 0 * the toast buttered .
l-93 1 lydia pocketed the change .
l-93 0 * lydia pocketed her pocket with the change .
l-93 0 * the change pocketed .
l-93 1 doug removed the scratches from the tabletop .
l-93 1 doug removed the scratches from around the sink .
l-93 0 * doug removed the scratches out of the drawer .
l-93 0 * doug removed the scratches to nowhere .
l-93 0 * doug removed the tabletop of scratches .
l-93 0 * doug removed at the scratches from the tabletop .
l-93 0 * the scratches removed from the tabletop .
l-93 1 the king banished the general from the army .
l-93 1 the king banished the general to a mountain fortress .
l-93 0 * the king banished the general from the palace to a mountain fortress .
l-93 0 * the king banished at the general from the army .
l-93 0 * the general banished from the army .
l-93 1 doug cleared the dishes from under the rack .
l-93 1 doug cleared the table .
l-93 0 * doug cleared at the table of dishes .
l-93 0 * doug cleared at the table .
l-93 1 the strong winds cleared the skies .
l-93 1 the strong winds slowly cleared the clouds from the sky .
l-93 1 brian wiped the fingerprints from the counter .
l-93 1 brian wiped the fingerprints from inside the cupboard .
l-93 1 brian wiped the fingerprints from under the cupboard .
l-93 1 brian wiped the fingerprints from outside the cupboard .
l-93 0 * brian wiped the counter of fingerprints .
l-93 1 brian wiped the counter .
l-93 1 paula trimmed the bush .
l-93 1 brian was wiping the counter .
l-93 1 brian was wiping .
l-93 1 brian was wiping the wall behind the stove .
l-93 1 carla shoveled the snow from the walk .
l-93 1 carla shoveled the snow from under the bushes .
l-93 1 carla shoveled the snow from among the bushes .
l-93 1 carla shoveled the snow from near the bushes .
l-93 0 * carla shoveled the walk of snow .
l-93 0 * carla shoveled at the walk .
l-93 1 carla was shoveling the walk .
l-93 1 carla was shoveling .
l-93 1 carla mopped the floor under the furniture .
l-93 1 carla mopped under the furniture .
l-93 1 the thief stole the painting for mr. smith .
l-93 0 * the thief stole mr. smith the painting .
l-93 0 * the thief stole at the painting from the museum .
l-93 0 * the painting stole from the museum .
l-93 0 * the doctor cured pneumonia from pat .
l-93 0 * pat cured of pneumonia .
l-93 1 the swindler cheated pat of her fortune .
l-93 1 the cook boned the fish .
l-93 0 * the fish boned .
l-93 0 * the fish scrubbed .
l-93 1 the men mined the gold .
l-93 0 * the gold mined .
l-93 1 nora sent the book from paris .
l-93 1 nora sent the book to london .
l-93 1 nora sent the book from paris to london .
l-93 1 nora sent the book to peter .
l-93 0 * nora sent at the book to peter .
l-93 0 * the book sent to peter .
l-93 1 nora sent books to children .
l-93 0 * books send easily to children .
l-93 1 carla slid the books across the table .
l-93 1 carla slid the book to dale .
l-93 1 carla slid dale the book .
l-93 0 * carla slid at the book to dale .
l-93 1 the books slid across the table .
l-93 1 carla slid those books across the table .
l-93 1 those books slide across the table easily .
l-93 1 nora brought the book to the meeting .
l-93 1 nora brought the book to pamela .
l-93 1 nora brought the book from horne .
l-93 1 nora brought pamela the book .
l-93 0 * nora brought at the book to the meeting .
l-93 0 * the book brought to the meeting .
l-93 0 * the book brings easily to the meeting .
l-93 1 amanda carried the package .
l-93 1 amanda carried the package from boston .
l-93 1 amanda carried the package to new york .
l-93 1 amanda carried the package from boston to new york .
l-93 0 * amanda carried at the package to new york .
l-93 0 * the package carried to new york .
l-93 0 * the package carried .
l-93 1 amanda carried packages to new york .
l-93 1 amanda carried packages .
l-93 0 * packages carry easily to new york .
l-93 1 amanda drove the package from boston to new york .
l-93 1 amanda drove the package to new york .
l-93 1 amanda drove the package from boston .
l-93 1 amanda drove the package .
l-93 1 amanda drove the package to pamela .
l-93 0 * amanda drove at the package to new york .
l-93 0 * amanda drove at the package .
l-93 0 * the package drove to new york .
l-93 0 * the package drove .
l-93 1 amanda drove packages to new york .
l-93 1 amanda drove packages .
l-93 0 * packages drive easily .
l-93 1 nora pushed the chair .
l-93 1 nora pushed at the chair .
l-93 1 nora pushed on the chair .
l-93 1 nora pushed against the chair .
l-93 1 nora pushed through the crowd .
l-93 1 nora pushed her way through the crowd .
l-93 1 nora pushed the chair against the wall .
l-93 1 they lent a bicycle to me .
l-93 1 they lent me a bicycle .
l-93 0 * they lent me with a bicycle .
l-93 0 * a bicycle lent .
l-93 0 * a bicycle lent to me .
l-93 1 we contributed our paycheck to her .
l-93 0 * we contributed her our paycheck .
l-93 0 * we contributed her with our paycheck .
l-93 0 * our paycheck contributed .
l-93 0 * we offered a job behind her .
l-93 1 we offered her a job .
l-93 0 * we offered her with a job .
l-93 0 * a job offered to her .
l-93 1 brown presented jones with a plaque .
l-93 1 the presentation of a plaque was a proud moment .
l-93 1 brown equipped jones with a camera .
l-93 0 * brown equipped a camera near jones .
l-93 0 * brown equipped a camera next to jones .
l-93 0 * brown equipped a camera at jones .
l-93 0 * brown equipped a camera to jones .
l-93 0 * brown equipped jones a camera .
l-93 1 carmen bought a dress .
l-93 1 carmen bought a dress at bloomingdale 's .
l-93 1 carmen bought a dress for mary .
l-93 0 * carmen bought a dress to mary .
l-93 1 carmen bought a dress from diana .
l-93 0 * carmen bought diana of a dress .
l-93 1 carmen bought a dress at bloomingdale 's for $ 50 .
l-93 1 $ 50 wo n't even buy a dress at bloomingdale 's .
l-93 1 carmen obtained the spare part .
l-93 0 * carmen obtained mary a spare part .
l-93 0 * carmen obtained a spare part to mary .
l-93 1 carmen obtained a spare part from diana .
l-93 0 * carmen obtained diana of a spare part .
l-93 1 carmen purchased a dress at bloomingdale 's for $ 50 .
l-93 1 $ 50 wo n't even purchase a dress at bloomingdale 's .
l-93 1 gwen exchanged the dress for a shirt .
l-93 0 * gwen exchanged the dress to mary .
l-93 0 * gwen exchanged mary the dress .
l-93 1 gwen exchanged the dress for mary .
l-93 1 the children like to berry in the summer .
l-93 0 * she held at the rail .
l-93 0 * the rail holds easily .
l-93 1 she held his arm .
l-93 1 she held him by the arm .
l-93 0 * she held the paper from him .
l-93 1 michelle kept the papers in the desk .
l-93 1 michelle kept the papers behind the desk .
l-93 1 michelle kept the papers over the desk .
l-93 1 michelle kept the papers under the desk .
l-93 1 frances hid the presents from sally .
l-93 1 frances hid the presents behind the books .
l-93 0 * frances hid sally of the presents .
l-93 1 steve tossed the ball .
l-93 1 steve tossed the ball into the garden .
l-93 1 steve tossed the ball over the fence .
l-93 1 steve tossed the ball from the tree to the gate .
l-93 1 steve tossed the ball at anna .
l-93 0 * steve tossed anna with the ball .
l-93 1 steve tossed the ball to anna .
l-93 1 steve tossed the ball against the wall .
l-93 0 * steve tossed the wall with the ball .
l-93 0 * steve tossed at the ball .
l-93 0 * the ball tossed .
l-93 1 steve tossed the softball .
l-93 0 * baseballs toss easily .
l-93 1 steve pelted anna with acorns .
l-93 0 * steve pelted acorns at anna .
l-93 1 steve pelted anna .
l-93 0 * steve pelted at anna .
l-93 0 * steve pelted at anna with acorns .
l-93 0 * steve pelted acorns against anna .
l-93 0 * steve pelted acorns to anna .
l-93 0 * steve pelted anna acorns .
l-93 1 steve pelted the squirrels with acorns .
l-93 0 * squirrels pelt easily with acorns .
l-93 1 paula hit the stick on the fence .
l-93 1 paula hit the stick against the fence .
l-93 0 * paula hit the stick into the fence .
l-93 1 paula hit at the fence with the stick .
l-93 1 paula hit deirdre on the back .
l-93 1 paula hit deirdre 's back .
l-93 1 paula hit the sticks together .
l-93 0 * paula hit the sticks .
l-93 0 * the fence hit with a stick .
l-93 0 * the fence hit .
l-93 0 * the fence hits easily .
l-93 1 the stick hit the fence .
l-93 0 * paula swatted the cloth on the fly .
l-93 0 * paula swatted the cloth against the fly .
l-93 1 paula swatted the fly with the cloth .
l-93 0 * paula swatted the cloth through the fly .
l-93 0 * paula swatted the cloth into the fly .
l-93 1 paula swatted the fly .
l-93 1 paula swatted at the fly .
l-93 1 paula swatted deirdre on the back .
l-93 1 paula swatted deirdre 's back .
l-93 0 * the fly swatted .
l-93 1 paula swatted flies .
l-93 0 * flies swat easily .
l-93 1 paula swatted the fly with a cloth .
l-93 0 * the cloth swatted the fly .
l-93 0 * paula spanked her right hand against the naughty child .
l-93 1 paula spanked the naughty child with her right hand .
l-93 0 * paula spanked her right hand into the naughty child .
l-93 0 * paula spanked her right hand through the naughty child .
l-93 1 paula spanked the naughty child on the back .
l-93 1 paula spanked the naughty child 's back .
l-93 1 paula spanked the naughty child .
l-93 0 * the naughty child spanked .
l-93 0 * naughty children spank easily .
l-93 0 * paula 's right hand spanked the naughty child .
l-93 0 * the wall banged with the grocery cart .
l-93 1 the old cart banged against the new cart .
l-93 0 * the old and new carts banged .
l-93 1 the old and new carts banged together .
l-93 1 alison poked the needle through the cloth .
l-93 1 alison poked the needle into the cloth .
l-93 1 alison poked the cloth with a needle .
l-93 1 alison poked the cloth .
l-93 1 alison poked the needle through the denim .
l-93 1 carrie touched the cat .
l-93 0 * carrie touched the stick against the cat .
l-93 1 carrie touched the cat with the stick .
l-93 0 * carrie touched the stick into the cat .
l-93 0 * carrie touched the stick through .
l-93 0 * carrie touched at the cat .
l-93 1 carrie touched him on the shoulder .
l-93 0 * that cat touches easily .
l-93 1 carrie touched the fence with a stick .
l-93 0 * the stick touched the fence .
l-93 1 carol cut the bread with a knife .
l-93 1 carol cut the bread .
l-93 1 carol cut at the bread .
l-93 1 carol cut herself on the thumb .
l-93 1 carol cut her thumb .
l-93 1 carol cut the whole wheat bread .
l-93 1 whole wheat bread cuts easily .
l-93 1 the knife cut the bread .
l-93 1 this knife cuts well .
l-93 1 carol carved the stone with a chisel .
l-93 1 carol carved the stone .
l-93 0 * carol carved at the stone .
l-93 0 * carol carved the tree on the branch .
l-93 1 carol carved the tree 's branch .
l-93 0 * the stone carved .
l-93 1 carol carved the marble .
l-93 1 marble carves easily .
l-93 1 carol carved the marble with a chisel .
l-93 1 the chisel carved the marble .
l-93 1 that chisel carved the statue .
l-93 1 that chisel carves well .
l-93 1 herman mixed the eggs with the cream .
l-93 1 herman mixed the eggs and the cream .
l-93 1 the eggs mixed with the cream .
l-93 1 the eggs and the cream mixed .
l-93 1 herman mixed the eggs and the cream together .
l-93 1 i mixed the soap into the water .
l-93 1 i mixed the soap and the water .
l-93 1 i mixed the eggs with cream .
l-93 1 i mixed the eggs and cream .
l-93 1 i mixed the eggs and cream together .
l-93 1 harriet alternated folk songs and pop songs .
l-93 1 plays alternate with ballets .
l-93 1 plays and ballets alternate .
l-93 1 harriet interconnected the pieces .
l-93 1 herman whipped the cream .
l-93 0 * linda taped the wall with the picture .
l-93 1 linda taped the label and the cover together .
l-93 1 the child clung to her mother .
l-93 0 * the child and her mother clung .
l-93 0 * the war clung the child to her mother .
l-93 1 the yolk separated from the white .
l-93 1 the yolk and the white separated .
l-93 1 i separated the cream from the milk .
l-93 1 i separated the egg yolk and the egg white .
l-93 1 i separated the egg yolks and the egg whites .
l-93 0 * i separated the milk of the cream .
l-93 1 i broke the twig off the branch .
l-93 1 i broke the twig off of the branch .
l-93 0 * i broke the twig and the branch .
l-93 1 i broke twigs off those branches .
l-93 1 i broke twigs off of those branches .
l-93 1 i broke those twigs and branches apart .
l-93 1 i detached the handle .
l-93 1 i detached the handle from the box .
l-93 0 * i detached the handle and the box .
l-93 0 * the handle detached from the box .
l-93 1 i detached that new handle .
l-93 1 i detached that new handle from the box .
l-93 1 that new handle detaches easily .
l-93 0 * that new handle detaches from the box easily .
l-93 1 the winter schedule differed from the spring schedule .
l-93 1 this flyer differs from that flyer .
l-93 0 * i differed this flyer from that flyer .
l-93 1 phyllis dyed the dress .
l-93 1 smith inscribed his name over the door .
l-93 1 smith inscribed his name under the picture .
l-93 1 smith inscribed the ring with his name .
l-93 1 smith was annealing the rings .
l-93 1 smith was annealing .
l-93 1 the jeweller printed the name on the ring .
l-93 1 the jeweller printed the name over the door .
l-93 1 the jeweller printed the name under the picture .
l-93 1 the jeweller printed the name onto the cup .
l-93 1 the jeweller scribbled his name on the contract .
l-93 1 smith was scribbling his notes .
l-93 1 smith was scribbling .
l-93 1 the jeweller decorated the ring .
l-93 1 the secretary transcribed the speech .
l-93 1 the secretary transcribed the speech into the record .
l-93 0 * the secretary transcribed the record with the speech .
l-93 1 martha carved a toy out of the piece of wood .
l-93 1 martha carves .
l-93 1 martha carved a toy out of a piece of wood for the baby .
l-93 1 martha carved a piece of wood for the baby .
l-93 1 martha carved a piece of wood into a toy for the baby .
l-93 1 martha carved beautiful toys out of this wood .
l-93 1 this wood carves beautiful toys .
l-93 1 $ 100,000 will build you a house .
l-93 1 $ 100,000 will build a house .
l-93 1 the gardener grew an oak tree from that acorn .
l-93 1 donna fixed a sandwich .
l-93 0 * donna fixed last night 's leftovers into a sandwich .
l-93 1 donna fixed a sandwich for me .
l-93 1 donna fixed me a sandwich .
l-93 1 david constructed a house .
l-93 1 david constructed a house out of bricks .
l-93 0 * david constructed me a house .
l-93 1 david constructed the house .
l-93 0 * the house constructed .
l-93 0 * david constructed the mansion from bricks into a house .
l-93 1 i shaped the dough into a loaf .
l-93 1 i shaped the dough .
l-93 0 * i shaped a loaf from the dough .
l-93 1 i twirled the dough into a pretzel .
l-93 0 * i shaped a good loaf from this dough .
l-93 0 * this dough shapes a good loaf .
l-93 0 * i shaped the dough from a lump into a loaf .
l-93 0 * he turned from a prince .
l-93 1 sandy sang a song to me .
l-93 1 sandy sang me a song .
l-93 1 sandy sang a song for me .
l-93 1 sandy sang a song .
l-93 1 sandy sang .
l-93 0 * the song sang .
l-93 1 racial inequality engenders conflict .
l-93 0 * conflict engenders .
l-93 0 * the president appointed press secretary to smith .
l-93 1 the captain named the ship seafarer .
l-93 0 * the captain named seafarer to the ship .
l-93 1 the president declared smith press secretary .
l-93 0 * the president declared smith as press secretary .
l-93 0 * the president declared smith to press secretary .
l-93 0 * the press conjectured smith the appointee .
l-93 1 the press conjectured that smith would be the appointee .
l-93 1 dina posed as a lawyer .
l-93 0 * dina posed a lawyer .
l-93 1 miriam tutored her brother .
l-93 1 her cousin clerked for judge davis .
l-93 1 i see someone running down the street .
l-93 1 i saw jane run down the street .
l-93 1 i saw the mona lisa .
l-93 0 * the mona lisa sees easily .
l-93 0 * we spotted that they were running .
l-93 0 * we spotted them run .
l-93 0 * runaway cats spot easily .
l-93 1 we peered at the baby .
l-93 1 we peered around the room .
l-93 1 we peered through the screen .
l-93 1 we peered into the closet .
l-93 1 that pea soup tasted delicious to me .
l-93 1 the clown amused the children .
l-93 0 * the children amused at the clown .
l-93 1 the clown amused the little children .
l-93 1 little children amuse easily .
l-93 1 that joke never fails to amuse little children .
l-93 1 that joke never fails to amuse .
l-93 1 that the clown had a red nose amused the children .
l-93 1 to win the prize : would thrill me .
l-93 1 the clown was amusing to the children .
l-93 1 tourists admire paintings .
l-93 0 * paintings admire easily .
l-93 1 i admired him as a teacher .
l-93 0 * i admired him a teacher .
l-93 1 megan marveled at the beauty of the grand canyon .
l-93 1 dorothy needs new shoes .
l-93 0 * dorothy is needing new shoes .
l-93 1 dorothy needs her skills .
l-93 1 dorothy needs her for her skills .
l-93 0 * dorothy needs the skills in her .
l-93 1 dorothy needs that dress as a costume .
l-93 0 * dorothy needs that dress a costume .
l-93 1 dana longs for a sunny day .
l-93 1 dana is longing for a sunny day .
l-93 1 they praised the volunteers .
l-93 1 the director praised the volunteers .
l-93 0 * volunteers praise easily .
l-93 1 they praised them as volunteers .
l-93 0 * the inspector analyzed the soundness in the building .
l-93 1 i hunted game in the woods .
l-93 1 i was hunting game .
l-93 1 i was hunting game in the woods .
l-93 1 i was hunting in the woods .
l-93 1 i was hunting .
l-93 1 i searched for treasure in the cave .
l-93 0 * i searched treasure in the cave .
l-93 0 * we rummaged the drawer for important documents .
l-93 1 we rummaged in the drawer for important documents .
l-93 0 * we rummaged important documents in the drawer .
l-93 0 * i hunted the woods for game .
l-93 0 * i hunted for game in the woods .
l-93 1 i hunted the secret out of him .
l-93 1 brenda haggled with molly .
l-93 1 brenda and molly haggled .
l-93 0 * brenda haggled molly .
l-93 1 brenda and molly haggled about the party .
l-93 1 bill married kathy .
l-93 0 * brenda met .
l-93 1 brenda and molly met .
l-93 1 anne met with cathy .
l-93 1 wanda taught the students .
l-93 1 wanda taught french to the students .
l-93 1 wanda taught the students french .
l-93 1 wanda taught the students that the earth was round .
l-93 1 ellen told a story .
l-93 1 ellen told a story to helen .
l-93 1 ellen told helen a story .
l-93 1 ellen told helen .
l-93 1 ellen told helen about the situation .
l-93 0 * ellen told a story at helen .
l-93 0 * ellen told for helen to come .
l-93 1 susan whispered .
l-93 1 susan whispered to rachel .
l-93 1 susan whispered a few words .
l-93 1 susan whispered the news to rachel .
l-93 0 * susan whispered rachel the news .
l-93 1 susan whispered for me to come .
l-93 1 susan whispered `` shut up '' .
l-93 1 susan whispered `` shut up '' at them .
l-93 1 they whispered that the winner would be announced tonight .
l-93 1 heather cabled the news .
l-93 1 heather cabled sara .
l-93 1 heather cabled the news to sara .
l-93 1 heather cabled sara the news .
l-93 0 * heather cabled the news at sara .
l-93 1 heather cabled sara about the situation .
l-93 1 heather cabled for sara to come .
l-93 1 ellen talked .
l-93 0 * ellen talked for helen to come .
l-93 1 ellen talked with helen about the problem .
l-93 1 ellen talked with helen .
l-93 1 ellen and helen talked .
l-93 1 ellen and helen talked together .
l-93 0 * ellen talked helen .
l-93 1 ellen was conferring .
l-93 1 ellen conferred with helen .
l-93 1 ellen conferred with helen about the problem .
l-93 0 * ellen conferred to helen .
l-93 0 * ellen conferred for helen to come .
l-93 1 ellen and helen conferred .
l-93 0 * ellen and helen conferred together .
l-93 0 * ellen conferred helen .
l-93 1 ellen said to helen that melons were selling well .
l-93 1 ellen said something .
l-93 1 ellen said something to helen .
l-93 0 * ellen said to helen .
l-93 1 ellen complained to helen .
l-93 1 ellen complained about the situation .
l-93 1 ellen complained about the situation to helen .
l-93 1 ellen warned helen .
l-93 0 * ellen warned to helen .
l-93 1 ellen warned against skating on thin ice .
l-93 1 ellen warned helen that melons were selling .
l-93 1 ellen warned that melons were selling .
l-93 0 * ellen warned for helen to come .
l-93 1 ellen warned helen about the traffic jam .
l-93 1 the dog barked .
l-93 1 the dog barked at the cat .
l-93 1 cynthia ate the peach .
l-93 1 cynthia ate .
l-93 1 cynthia ate at the peach .
l-93 0 * cynthia ate on the peach .
l-93 1 cynthia ate the peach with a fork .
l-93 1 cynthia nibbled the carrot .
l-93 1 cynthia nibbled .
l-93 1 cynthia nibbled at the carrot .
l-93 1 cynthia gobbled the pizza .
l-93 1 cynthia gobbled the pizza down .
l-93 0 * cynthia gobbled .
l-93 0 * cynthia gobbled at the pizza .
l-93 0 * cynthia gobbled on the pizza .
l-93 1 cynthia devoured the pizza .
l-93 0 * cynthia devoured .
l-93 0 * cynthia devoured at the pizza .
l-93 0 * cynthia devoured on the pizza .
l-93 1 cynthia lunched .
l-93 1 cynthia lunched on peaches .
l-93 0 * cynthia lunched peaches .
l-93 0 * cynthia lunched at peaches .
l-93 0 * cynthia munched .
l-93 1 cynthia munched on peaches .
l-93 0 * cynthia munched peaches .
l-93 0 * cynthia munched at peaches .
l-93 1 teresa bottle fed the baby .
l-93 1 teresa bottle fed soy milk to the baby .
l-93 1 teresa bottle fed the baby soy milk .
l-93 0 * teresa bottle fed soy milk .
l-93 1 paul yawned .
l-93 0 * paul yawned on mary .
l-93 0 * paul yawned at mary .
l-93 1 paul breathed .
l-93 0 * paul breathed at mary .
l-93 1 paul exhaled .
l-93 0 * paul exhaled at mary .
l-93 0 * paul exhaled on mary .
l-93 1 paul laughed .
l-93 1 she laughed from embarrassment .
l-93 1 linda winked her eye .
l-93 0 * linda winked her nose .
l-93 0 * linda winked his eye .
l-93 1 linda winked .
l-93 1 linda winked at the audience .
l-93 1 linda winked in agreement .
l-93 0 * jennifer craned her arm .
l-93 1 jennifer shook her finger at the naughty child .
l-93 1 the princess bowed .
l-93 1 the princess bowed to the queen .
l-93 0 * the heavy meal dozed gloria .
l-93 1 gloria dozed .
l-93 1 sharon flinched .
l-93 1 sharon flinched at the sight of the accident .
l-93 0 * the shock flinched sharon .
l-93 1 sharon shivered .
l-93 1 sharon shivered from fear .
l-93 1 sharon shivered at the thought of the cold sea .
l-93 0 * the fear shivered sharon .
l-93 1 the pirates drowned the sailor .
l-93 1 the sailor drowned .
l-93 1 the sea monster drowned the sailors .
l-93 1 my eyes are itching .
l-93 1 my eyes are itching me .
l-93 0 * my eyes are itching my brother .
l-93 1 my eyes are itching from the smoke .
l-93 1 my heart is pounding .
l-93 0 * my heart is pounding my brother .
l-93 1 my heart is pounding from fear .
l-93 1 tessa sprained her ankle .
l-93 0 * tessa sprained mary 's ankle .
l-93 1 sharon fainted .
l-93 1 sharon fainted at the sight of the accident .
l-93 0 * hunger fainted sharon .
l-93 1 the baby dressed .
l-93 1 marlene dressed the baby .
l-93 1 marlene dressed herself .
l-93 0 * marlene dressed her body .
l-93 0 * the horse groomed itself .
l-93 1 the barber shaved my chin .
l-93 1 i shaved my chin .
l-93 1 celia brushed the baby 's hair .
l-93 1 celia brushed her hair .
l-93 0 * celia brushed herself .
l-93 1 she always wore purple dresses .
l-93 0 * she always wore herself .
l-93 0 * she always wore herself in purple .
l-93 0 * she always wore .
l-93 1 she spruced herself up before the job interview .
l-93 1 she spruced up before the job interview .
l-93 1 she was always clad in black .
l-93 0 * her stepmother always clad her in black .
l-93 0 * she always clad herself in black .
l-93 0 * she always clad in black .
l-93 1 brutus murdered julius caesar .
l-93 0 * julius caesar murdered .
l-93 1 the bandits murdered innocent victims .
l-93 0 * innocent victims murder easily .
l-93 1 brutus murdered julius caesar with a dagger .
l-93 1 the exterminator killed the insects with ddt .
l-93 1 the witch poisoned snow white .
l-93 0 * children poison easily .
l-93 1 the jewel sparkled .
l-93 1 jewels sparkled on the crown .
l-93 1 the crown sparkled with jewels .
l-93 1 a magnificent diamond sparkled on his finger .
l-93 1 on his finger sparkled a magnificent diamond .
l-93 1 on his finger there sparkled a magnificent diamond .
l-93 0 * the director sparkled the lights .
l-93 1 the door hinges squeaked .
l-93 1 birds sang in the trees .
l-93 1 the trees sang with birds .
l-93 1 in the hallway ticked a grandfather clock .
l-93 1 in the hallway there ticked a grandfather clock .
l-93 1 i buzzed the bell .
l-93 1 the bell chimed the hour .
l-93 1 a squeaking door announced john 's presence .
l-93 1 the onions reeked .
l-93 1 the room reeked of onions .
l-93 1 the room reeked .
l-93 0 * kelly reeked the onions .
l-93 1 the well gushed oil .
l-93 0 * i gushed the fountain .
l-93 1 i bled him .
l-93 1 oil gushed from the well .
l-93 1 the streets gushed with water .
l-93 1 a fragrant stew bubbled over the fire .
l-93 1 caesar put a gushing fountain by his palace .
l-93 1 the romans destroyed the city .
l-93 0 * the city destroyed .
l-93 0 * cities destroy easily .
l-93 0 * the romans destroyed the city into ruins .
l-93 0 * the romans destroyed ruins from the city .
l-93 0 * the romans destroyed the city into a ruin .
l-93 0 * the romans destroyed the city from a capital into a ruin .
l-93 1 the builders destroyed the warehouse with explosives .
l-93 1 the explosives destroyed the warehouse .
l-93 1 the builders destroyed the warehouse .
l-93 0 * the builders destroyed at the warehouse .
l-93 1 tony broke the window .
l-93 1 tony broke the crystal vase .
l-93 1 tony broke the window with a hammer .
l-93 1 tony broke the cup against the wall .
l-93 0 * tony broke the wall with the cup .
l-93 0 * tony broke at the window .
l-93 0 * tony broke herself on the ann .
l-93 1 tony broke her arm .
l-93 1 tony bent the rod with pliers .
l-93 1 the rod bent .
l-93 1 tony bent the copper rod .
l-93 1 copper rods bend easily .
l-93 1 the pliers bent the rod .
l-93 1 tony bent the rod against the table .
l-93 0 * tony bent the table with the rod .
l-93 0 * tony bent at the rod .
l-93 0 * tony bent mary in the arm .
l-93 1 tony bent mary 's arm .
l-93 1 the potatoes baked .
l-93 1 jennifer baked idaho potatoes .
l-93 1 idaho potatoes bake beautifully .
l-93 1 jennifer baked the potatoes in the oven .
l-93 1 this oven bakes potatoes well .
l-93 0 * jennifer baked at the potatoes .
l-93 1 bill dried the clothes .
l-93 1 the clothes dried .
l-93 1 bill dried the cotton clothes .
l-93 1 cotton clothes dry easily .
l-93 1 bill dried the clothes with a hair dryer .
l-93 1 the hair dryer dried the clothes .
l-93 0 * bill dried at the clothes .
l-93 1 a lot of clothes are drying on the line .
l-93 0 * the line is drying with a lot of clothes .
l-93 1 bill is drying a lot of clothes on the line .
l-93 0 * bill is drying the line with a lot of clothes .
l-93 0 * on the line are drying a lot of clothes .
l-93 0 * on the line there are drying a lot of clothes .
l-93 1 the roses bloomed .
l-93 0 * the sun bloomed the roses .
l-93 1 the temperature soared .
l-93 0 * the heat soared the temperature .
l-93 0 * there soared oil in price .
l-93 0 * in price soared oil .
l-93 1 cornelia lodged with the smiths .
l-93 1 cornelia lodged at mrs. parker 's .
l-93 1 an old woman lodged at mrs. parker 's .
l-93 0 * there lodged an old woman at mrs. parker 's .
l-93 0 * at mrs. parker 's lodged an old woman .
l-93 1 squatters lodged in these abandoned buildings .
l-93 0 * these abandoned buildings lodged with squatters .
l-93 1 the soldiers lodged in the schoolhouse .
l-93 1 an old woman lived in the forest .
l-93 1 unicorns do n't exist .
l-93 1 there exists a solution to this problem .
l-93 1 in the forest languished an old woman .
l-93 1 a crowd of people remained in the square .
l-93 0 * the square remained with a crowd of people .
l-93 0 * the famous mathematician existed a solution to the problem .
l-93 1 the beer bubbled .
l-93 1 a fire raged in the mountains .
l-93 1 in the mountains there raged a fire .
l-93 1 a fire raged all through the mountains .
l-93 1 all through the mountains raged a fire .
l-93 1 roses flowered in the garden .
l-93 1 the garden flowered with roses .
l-93 1 a fire raged over the fields .
l-93 0 * the farmers raged a fire over the fields .
l-93 1 a large flag fluttered .
l-93 1 a large flag fluttered over the fort .
l-93 1 many flags fluttered over the fort .
l-93 1 over the fort there fluttered a large flag .
l-93 1 over the fort fluttered a large flag .
l-93 1 the tree trembled .
l-93 1 the flag waved .
l-93 1 the hall is echoing with voices .
l-93 1 a loud cry echoed through the hall .
l-93 1 through the hall there echoed a loud cry .
l-93 1 through the hall echoed a loud cry .
l-93 1 the music echoed .
l-93 0 * the magician echoed the music .
l-93 0 * an echoing voice rang out .
l-93 1 a striped fish swam in the aquarium .
l-93 1 in the aquarium swam a striped fish .
l-93 1 in the aquarium there swam a striped fish .
l-93 1 the cattle are herding in the pasture .
l-93 1 the cattle herded .
l-93 1 i herded the cattle .
l-93 1 the bag is bulging with groceries .
l-93 0 * groceries are bulging in the bag .
l-93 1 the bag is bulging .
l-93 0 * i had to bulge the bag with groceries .
l-93 1 a statue of jefferson stood on the comer .
l-93 1 there stood on the comer a statue of jefferson .
l-93 1 a statue of jefferson stood on the comer of the two boulevards .
l-93 1 on the comer of the two boulevards stood a statue of jefferson .
l-93 1 the hanging gardens are a sight to behold .
l-93 1 the river runs from the lake to the sea .
l-93 1 the stream winds through the valley .
l-93 1 the stream crawls through the valley .
l-93 1 through the valley ran a rushing stream .
l-93 1 there ran through the valley a rushing stream .
l-93 1 italy borders france .
l-93 1 snow caps the mountain .
l-93 1 a ship appeared .
l-93 1 a large ship appeared on the horizon .
l-93 1 on the horizon appeared a large ship .
l-93 1 a solution immediately presented itself .
l-93 0 * a solution immediately presented .
l-93 1 a solution immediately presented itself to him .
l-93 1 a wonderful opportunity presented itself yesterday .
l-93 0 * to him presented itself a wonderful opportunity .
l-93 1 i presented a solution yesterday .
l-93 1 a solution presented itself yesterday .
l-93 1 the crowd vanished .
l-93 1 a valuable 13th-century manuscript recently vanished from the library .
l-93 1 the rabbit vanished into thin air .
l-93 0 * the magician vanished a rabbit into thin air .
l-93 1 a serious accident happened yesterday .
l-93 1 there happened a serious accident yesterday .
l-93 1 a serious accident happened in front of them .
l-93 1 in front of them happen .
l-93 1 the accident happened .
l-93 0 * the motorist happened the accident .
l-93 1 sylvia squirmed .
l-93 0 * the lecture squirmed sylvia .
l-93 1 the dog flopped onto the bed .
l-93 1 the dog flopped in the comer .
l-93 1 a dog lay in the comer .
l-93 0 * there lay a dog in the comer .
l-93 1 a dog lay in the corner .
l-93 0 * in the corner lay a dog .
l-93 1 the convict escaped .
l-93 1 the convict escaped from the police .
l-93 1 the convict escaped the police .
l-93 0 * the collaborators escaped the convict .
l-93 1 we abandoned the area .
l-93 0 * we abandoned from the area .
l-93 1 the ball rolled .
l-93 1 the ball rolled down the hill .
l-93 1 the ball rolled over the hill .
l-93 1 the ball rolled into the gutter .
l-93 1 bill rolled the ball down the hill .
l-93 0 * the ball rolled the hill .
l-93 1 the horse jumped over the stream .
l-93 1 the horse jumped across the stream .
l-93 1 the horse jumped into the stream .
l-93 1 the horse jumped out of the stream .
l-93 1 the lions jumped through the hoop .
l-93 1 the horse jumped the stream .
l-93 1 a little white rabbit jumped out of the box .
l-93 1 there jumped out of the box a little white rabbit .
l-93 1 we walked ourselves into a state of exhaustion .
l-93 1 tom ran the soles off his shoes .
l-93 1 he skated penny around the rink .
l-93 1 they rowed .
l-93 1 he rowed penny across the lake .
l-93 1 penny rowed across the lake .
l-93 1 they rowed along the canals of venice .
l-93 1 they rowed the canals of venice .
l-93 1 they waltzed .
l-93 1 she waltzed across the floor .
l-93 1 he waltzed her across the floor .
l-93 1 jackie chased after the thief .
l-93 1 jackie chased the thief down the street .
l-93 1 jackie chased the thief .
l-93 0 * the thief chased down the street .
l-93 0 * the thief chased .
l-93 0 * rose accompanied .
l-93 1 sasha lingered in the museum .
l-93 1 sasha lingered over lunch .
l-93 0 * phyllis lingered sasha over lunch .
l-93 1 maggie hurried through the museum .
l-93 1 her sister hurried .
l-93 1 maggie hurried her sister .
l-93 1 the package weighed ten pounds .
l-93 0 * ten pounds was weighed by the package .
l-93 0 * i weighed the package ten pounds .
l-93 1 i weighed the package .
l-93 1 the book costs $ 10 .
l-93 0 * the book valued at $ 200 .
l-93 0 * the book valued $ 200 .
l-93 1 the phone company billed me $ 10 for that phone call .
l-93 0 * the phone company billed $ 10 to me .
l-93 1 the phone company billed me $ 10 .
l-93 0 * the phone company billed $ 10 as me .
l-93 1 the meeting began at 4 p.m .
l-93 1 i began the meeting at 4 p.m .
l-93 1 wilma completed the assignment .
l-93 0 * the assignment completed .
l-93 1 my family always summers at the seashore .
ks08 1 the man kicked a ball .
ks08 1 a man kicked the ball .
ks08 1 the ball kicked a man .
ks08 1 a ball kicked the man .
ks08 1 the ball , a man kicked .
ks08 1 the man , a ball kicked .
ks08 0 * kicked the man the ball .
ks08 0 * man the ball kicked the .
ks08 0 * the man a ball kicked .
ks08 0 * kim lives in the house lee sold it to her .
ks08 0 * kim fond of lee .
ks08 1 kim is fond of lee .
ks08 1 in january 2002 , a dull star in an obscure constellation suddenly became 600,000 times more luminous than our sun , temporarily making it the brightest star in our galaxy .
ks08 1 the man kicked the ball .
ks08 1 the tall man kicked the ball .
ks08 1 the handsome , tall man kicked the ball .
ks08 1 the handsome , tall , nice man kicked the ball .
ks08 1 some sentences can go on .
ks08 1 some sentences can go on and on .
ks08 1 some sentences can go on and on and on .
ks08 1 some sentences can go on and on and on and on .
ks08 1 all native speakers have a grammatical competence which can generate an infinite set of grammatical sentences from a finite set of resources .
ks08 0 * the professor found some strong evidences of water on mars .
ks08 1 do not end a sentence with a preposition .
ks08 1 avoid double negatives .
ks08 0 * the evidence that john found was more helpful than the one that smith found .
ks08 0 * we had hoped to get three new equipments every month , but we only had enough money to get an equipment every two weeks .
ks08 0 * the equipment we bought last year was more expensive than the one we bought this year .
ks08 1 the student was hoping for a good clue .
ks08 1 the clue that john got was more helpful than the one that smith got .
ks08 1 the student was hoping for a tool .
ks08 1 the tool that jones got was more helpful than the one that smith got .
ks08 1 much evidence is needed .
ks08 1 much equipment is needed .
ks08 1 much information is needed .
ks08 1 much furniture is needed .
ks08 1 much advice is needed .
ks08 0 * much clue is needed .
ks08 0 * much tool is needed .
ks08 0 * much armchair is needed .
ks08 0 * much bags is needed .
ks08 0 * many evidence was provided .
ks08 0 * many equipment is available .
ks08 0 * the room contains many furniture .
ks08 1 the paper provides many clues .
ks08 1 the box contains many tools .
ks08 1 john offers many suggestions .
ks08 1 little evidence was provided .
ks08 1 little equipment is available .
ks08 1 john offers little advice .
ks08 1 little information was provided .
ks08 0 * little clue could be found .
ks08 0 * the box contains little tool .
ks08 0 * john offers little suggestion .
ks08 0 * the room contains little armchair .
ks08 0 * few evidence was provided .
ks08 0 * few equipment is available .
ks08 0 * the room contains few furniture .
ks08 0 * john offers few advice .
ks08 0 * few information was provided .
ks08 1 few clues could be found .
ks08 1 john offers few suggestions .
ks08 1 the room contains few armchairs .
ks08 1 the president was hoping for a good cake .
ks08 1 the bartender gave john some good beers .
ks08 1 no one knows how to tell from a good beer to a bad one .
ks08 1 my pastor says i ate too much cake .
ks08 1 the students drank too much beer last night .
ks08 1 people now drink less beer .
ks08 1 in english , the main verb agrees with the head element of the subject .
ks08 0 * the recent strike by pilots have cost the country a great deal of money from tourism and so on .
ks08 0 * the average age at which people begin to need eyeglasses vary considerably .
ks08 0 * despite of his limited educational opportunities , abraham lincoln became one of the greatest intellectuals in the world .
ks08 0 * a pastor was executed , notwithstanding on many applications in favor of him .
ks08 1 visiting relatives can be boring .
ks08 1 he said that that ` that ' that that man used was wrong .
ks08 1 kim and sandy is looking for a new bicycle .
ks08 1 i have never put the book .
ks08 1 the boat floated down the river sank .
ks08 1 chris must liking syntax .
ks08 1 there is eager to be fifty students in this class .
ks08 1 what is john eager to do ?
ks08 1 what is john easy to do ?
ks08 1 is the boy who holding the plate can see the girl ?
ks08 1 which chemical did you mix the hydrogen peroxide and ?
ks08 1 there seem to be a good feeling developing among the students .
ks08 1 strings have been pulled many times to get students into that university .
ks08 1 he washed himself .
ks08 0 * he washed herself .
ks08 0 * he washed myself .
ks08 0 * he washed ourselves .
ks08 1 he washed me .
ks08 1 he washed us .
ks08 1 wash yourself .
ks08 1 wash yourselves .
ks08 0 * wash himself .
ks08 1 wash me !
ks08 1 the weather is lovely today .
ks08 1 i am hoping that the weather is lovely today .
ks08 1 the birds are singing because the weather is lovely today .
ks08 1 they read the book .
ks08 1 he treats john very .
ks08 1 he walked right the wall .
ks08 1 they have no tv .
ks08 1 they have no car .
ks08 1 they have no information .
ks08 1 they have no friend .
ks08 0 * they have no went .
ks08 0 * they have no old .
ks08 0 * they have no and .
ks08 1 they can sing .
ks08 1 they can run .
ks08 1 they can smile .
ks08 1 they can cry .
ks08 0 * they can happy .
ks08 0 * they can down .
ks08 0 * they can door .
ks08 0 * they can very .
ks08 1 they read the new book .
ks08 1 they read the interesting book .
ks08 1 they read the scientific book .
ks08 0 * they read the sing book .
ks08 0 * they read the under book .
ks08 0 * they read the every book .
ks08 1 he treats john very nicely .
ks08 1 he treats john very badly .
ks08 1 he treats john very kindly .
ks08 0 * he treats john very kind .
ks08 0 * he treats john very shame .
ks08 1 he walked right into the wall .
ks08 0 * he walked right happy .
ks08 0 * he walked right the wall .
ks08 1 john sang a song , mary played the piano .
ks08 1 we found out that very lucrative jobs were in jeopardy .
ks08 0 * my these jobs are in jeopardy .
ks08 0 * the his jobs are in jeopardy .
ks08 1 i think learning english is not easy at all .
ks08 1 i doubt you can help me in understanding this .
ks08 1 i think that learning english is not all that easy .
ks08 1 i doubt if you can help me in understanding this .
ks08 1 i am anxious for you to study english grammar hard .
ks08 0 * i think that learning english to be not all that easy .
ks08 0 * i doubt if you to help me in understanding this .
ks08 0 * i am anxious for you should study english grammar hard .
ks08 1 john not leave .
ks08 1 john drink beer last night .
ks08 1 john leave for seoul tomorrow ?
ks08 1 john will study syntax , and mary , too .
ks08 1 he left .
ks08 1 he did not leave .
ks08 1 students wanted to write a letter .
ks08 1 students intended to surprise the teacher .
ks08 1 students objected to the teacher .
ks08 1 students sent letters to the teacher .
ks08 1 it is crucial for john to show an interest .
ks08 1 it is crucial that john should show an interest .
ks08 1 i know i should go to the dentist 's , but i just do n't want to .
ks08 1 i do n't really want to go to the dentist 's , but i know i should .
ks08 0 * she thought it was likely that everyone to fit into the car .
ks08 1 she thought it was likely that everyone might fit into the car .
ks08 1 she thought it was easy for everyone to fit into the car .
ks08 0 * she thought it was easy for everyone would fit into the car .
ks08 1 the umpire called off the game .
ks08 1 the umpire called the game off .
ks08 1 the two boys looked the word up .
ks08 1 the umpire fell off the deck .
ks08 1 the two boys looked up the high stairs .
ks08 1 the two boys looked up the high stairs from the floor .
ks08 0 * the umpire fell the deck off .
ks08 0 * the students looked the high stairs up from the floor .
ks08 0 * the students looked the high stairs up .
ks08 1 the umpire called it of .
ks08 0 * the umpire called off it .
ks08 0 * the umpire fell it off .
ks08 1 the umpire fell off it .
ks08 1 a tall boy threw the ball .
ks08 1 the cat chased the long string .
ks08 1 that ball hit a student .
ks08 1 the piano played a song .
ks08 1 the piano kicked a student .
ks08 1 that ball sang a student .
ks08 1 the tall , handsome man kicked the ball .
ks08 1 the tall , kind , handsome man kicked the ball .
ks08 1 the happy , happy , happy , happy , happy , happy man sang a song .
ks08 1 the mother of the boy and the girl is arriving soon .
ks08 1 the mother of the boy and the girl are arriving soon .
ks08 1 john saw the man with a telescope .
ks08 1 we need more intelligent leaders .
ks08 1 the student enjoyed his english syntax class last semester .
ks08 1 the policeman met several young students in the park last night .
ks08 1 it was the policeman that met several young students in the park last night .
ks08 1 it was several young students that the policeman met in the park last night .
ks08 1 it was last night that the policeman met several young students in the park .
ks08 0 * it was several young students in that the policeman met the park last night .
ks08 0 * it was in the park last night that the policeman met several young students .
ks08 1 where did the policeman meet several young students ?
ks08 1 what did you put in your box ?
ks08 1 where did you put the book ?
ks08 1 what did you do ?
ks08 1 john looked up the inside of the chimney .
ks08 1 john looked up the meaning of ` chanson ' .
ks08 1 what did he look up ?
ks08 1 where did he look ?
ks08 1 up what did he look ?
ks08 1 what do you think the man who is standing by the door is doing now ?
ks08 1 what do you think he is doing now ?
ks08 1 have you been to seoul ?
ks08 1 john might go home , so might bill .
ks08 1 john might pass the exam , and as might bill .
ks08 1 if john can speak french fluently – which we all know he can – we will have no problems .
ks08 1 john asked me to put the clothes in the cupboard , and to annoy him i really stuffed them there .
ks08 1 john asked me to put the clothes in the cupboard , and to annoy him i stuffed them there .
ks08 0 * john asked me to put the clothes in the cupboard , but i did so put the clothes in the suitcase .
ks08 1 the girls played in the water and swam under the bridge .
ks08 1 the children were neither in their rooms nor on the porch .
ks08 1 many people drink beer or wine .
ks08 0 * mary waited for the bus and to go home .
ks08 0 * lee went to the store and crazy .
ks08 1 liked ice cream .
ks08 0 * the whistle tune was beautiful .
ks08 0 * the easily student finished his homework .
ks08 0 * the my dog is a terrier .
ks08 1 the monkey wants to leave the meeting .
ks08 0 * the monkey eager to leave the meeting .
ks08 1 the monkeys approved of their leader .
ks08 1 the men practice medicine .
ks08 0 * the men doctors of medicine .
ks08 1 john read the book loudly .
ks08 1 john sounded happy .
ks08 1 john felt proud that his son won the game .
ks08 0 * john sounded happily .
ks08 0 * john sounded the student .
ks08 0 * john sounded in the park .
ks08 0 * the monkeys seem want to leave the meeting .
ks08 1 the monkeys seem eager to leave the meeting .
ks08 0 * john seems know about the bananas .
ks08 1 john seems certain about the bananas .
ks08 1 john came from seoul .
ks08 1 they put the book in the box .
ks08 1 they stayed in the hotel .
ks08 1 the fly fell into the soup .
ks08 1 the squirrel ran straight .
ks08 1 the squirrel ran right up the tree .
ks08 0 * the squirrel is right angry .
ks08 0 * the squirrel ran straight quickly .
ks08 0 * the squirrel ran right quickly .
ks08 1 this handsome man chased a dog .
ks08 1 a man kicked that ball .
ks08 1 that tall woman chased a cat .
ks08 1 his friend kicked a ball .
ks08 1 bill claims john believes mary thinks tom is honest .
ks08 1 jane imagines bill claims john believes mary thinks tom is honest .
ks08 1 the little boy hit the child with a toy .
ks08 1 chocolate cakes and pies are my favorite desserts .
ks08 0 * the children were in their rooms or happily .
ks08 1 john suddenly got off the bus .
ks08 1 john suddenly put off the customers .
ks08 0 * john suddenly got the bus off .
ks08 1 john suddenly put the customers off .
ks08 1 his second book came out earlier this year and became an instant best-seller .
ks08 1 when you book something such as a hotel room , you arrange to have it .
ks08 1 price quotes on selected categories will be sent out upon request .
ks08 1 no doubt that he was forced to leave his family against his will .
ks08 1 he intended to will the large amount of money to frank .
ks08 1 jane stood aside to let her pass .
ks08 1 he has a rail pass that 's right for you .
ks08 1 it is important for us to spend time with children .
ks08 1 he was arrested for being drunk .
ks08 1 i think that person we met last week is insane .
ks08 1 we believe that he is quite reasonable .
ks08 1 i forgot to return the book that i borrowed from the teacher .
ks08 1 i am anxious that you should arrive on time .
ks08 1 i am anxious for you to arrive on time .
ks08 0 * i am anxious for you should arrive on time .
ks08 1 i wonder whether you 'd be kind enough to give us information .
ks08 1 if students study hard , teachers will be happy .
ks08 1 whether they say it or not , most teachers expect their students to study hard .
ks08 1 john put a book on the table .
ks08 1 she turned down his offer .
ks08 1 he looked at a book about swimming .
ks08 1 he talked to a girl about swimming .
ks08 1 he talked with a girl about swimming .
ks08 1 i do n't know the people present .
ks08 0 * could you turn off the fire and on the light ?
ks08 0 * i know the truth and that you are innocent .
ks08 1 john refused the offer proudly .
ks08 1 i consider john the best candidate .
ks08 1 i saw him leaving the main building .
ks08 1 he took john to the school by the park .
ks08 1 john sang a song and danced to the music .
ks08 1 john wants to study linguistics in near future .
ks08 1 they told angelica to arrive early for the award .
ks08 1 that louise had abandoned the project surprised everyone .
ks08 1 i know you like the back of my hand .
ks08 1 time flies like an arrow .
ks08 1 i need to have that report on our web page by tomorrow .
ks08 1 the monkey scratched a boy on monday .
ks08 1 john tagged the monkey in the forest .
ks08 1 the monkey was tagged in the forest by john .
ks08 1 the cat devoured the rat .
ks08 1 the rat devoured the cat .
ks08 1 this car stinks .
ks08 1 it rains .
ks08 1 the committee disliked her proposal .
ks08 1 these books disappoint me .
ks08 1 our neighbor takes his children to school in his car .
ks08 0 * our neighbor take his children to school in his car .
ks08 1 the book , including all the chapters in the first section , is very interesting .
ks08 0 * the book , including all the chapters in the first section , are very interesting .
ks08 1 the effectiveness of teaching and learning depends on several factors .
ks08 0 * the effectiveness of teaching and learning depend on several factors .
ks08 1 the tornadoes that tear through this county every spring are more than just a nuisance .
ks08 0 * the tornadoes that tear through this county every spring is more than just a nuisance .
ks08 1 the lady singing with a boy is a genius , is n't he ?
ks08 0 * the lady singing with a boy is a genius , is n't she ?
ks08 1 with their teacher , the kids have arrived safely , have n't they ?
ks08 0 * with their teacher , the kids have arrived safely , has n't he ?
ks08 1 the kids have arrived safely .
ks08 1 it could be more detrimental .
ks08 1 is this teacher a genius ?
ks08 1 have the kids arrived safely ?
ks08 1 could it be more detrimental ?
ks08 1 the kids in our class have arrived safely .
ks08 0 * have in our class the kids arrived safely ?
ks08 1 his girlfriend bought this computer .
ks08 1 thunder frightens the dog .
ks08 1 the dog fears thunder .
ks08 1 his girlfriend bought this computer for him .
ks08 1 the child broke the teapot by accident .
ks08 1 this computer was bought for him by his girlfriend .
ks08 1 the teapot was broken by the child by accident .
ks08 1 this item belongs to the student .
ks08 0 * the student is belonged to by this item .
ks08 1 he remained a good friend to me .
ks08 0 * a good friend is remained to me .
ks08 1 john gave the boys the cds .
ks08 1 my mother baked me a birthday cake .
ks08 1 she was sent a review copy of the book by the publisher .
ks08 1 she was sent a review copy of the book .
ks08 1 john gave the cds to the boys .
ks08 1 the publisher sent a review copy of the book to her .
ks08 1 my mother baked a cake for me .
ks08 1 the cds were given to the boys by john .
ks08 1 a review copy of the book was sent to her by the publisher .
ks08 1 this nice cake was baked for me by my mother .
ks08 1 this is my ultimate goal .
ks08 1 michelle became an architect .
ks08 1 they elected graham chairman .
ks08 0 * chairman was elected graham .
ks08 0 * the best writer was considered andrew .
ks08 1 john made kim a great doll .
ks08 1 the situation became terrible .
ks08 1 this map is what he wants .
ks08 1 the message was that you should come on time .
ks08 1 i made kim angry .
ks08 1 i consider him immoral .
ks08 1 i regard andrew as the best writer .
ks08 1 they spoil their kids rotten .
ks08 1 john put books in the box .
ks08 1 john talked to bill about the exam .
ks08 1 she reminded him of the last time they met .
ks08 1 they would inform mary of any success they have made .
ks08 1 john gave a book to the student .
ks08 1 john bought a book for the student .
ks08 1 the bus stopped suddenly .
ks08 1 shakespeare wrote his plays a long time ago .
ks08 1 they went to the theater in london .
ks08 1 he failed chemistry because he ca n't understand it .
ks08 0 * john gave tom a book a record .
ks08 1 i saw this film several times last year during the summer .
ks08 1 my uncle visited today .
ks08 0 * today was visited by my uncle .
ks08 1 the termites destroyed the sand castle .
ks08 1 being honest is not an easy task .
ks08 1 that john passed surprised her .
ks08 1 to finish this work on time is almost unexpected .
ks08 1 under the bed is a safe place to hide .
ks08 1 i sent a surprise present to john .
ks08 1 they wondered what she did yesterday .
ks08 1 they believed that everybody would pass the test .
ks08 1 are you going on holiday before or after easter ? i prefer after easter .
ks08 1 that john passed surprised her , did n't it ?
ks08 1 that the march should go ahead and that it should be cancelled have been argued by different people at different times .
ks08 0 * that the march should go ahead and that it should be cancelled has been argued by different people at different times .
ks08 1 to finish it on time made quite a statement , did n't it ?
ks08 1 to delay the march and to go ahead with it have been argued by different people at different times .
ks08 0 * to delay the march and to go ahead with it has been argued by different people at different times .
ks08 1 the little cat devoured a mouse last night .
ks08 1 john left very early .
ks08 1 john studied hard to pass the exam .
ks08 1 she disappeared when the main party arrived .
ks08 1 a boy hit the ball .
ks08 1 the students felt comfortable in the class .
ks08 1 john gave a book to the students .
ks08 1 john died last night .
ks08 1 john bought a lot of books for his sons .
ks08 1 john promised bill to leave tomorrow morning .
ks08 1 john deprived his sons of game cards .
ks08 1 mary received an award from the department .
ks08 1 john told the rumor to his friend .
ks08 1 john put his books in the attic .
ks08 1 the government kept all the money .
ks08 1 john hit the ball with a bat .
ks08 1 john wiped the window with a towel .
ks08 1 the cat chased pat the mouse .
ks08 1 the mouse was chased by the cat .
ks08 1 there still remains an issue to be solved .
ks08 1 there lived a man with his grandson .
ks08 1 there arrived a tall , red haired and incredibly well dressed man .
ks08 0 * there sang a man with a pipe .
ks08 0 * there dances a man with an umbrella .
ks08 1 john resembles his mother .
ks08 1 a is similar to b .
ks08 1 john runs into the house .
ks08 1 mary looked at the sky .
ks08 1 the school awarded a few of the girls in miss kim 's class scholarships .
ks08 1 she was the nicest teacher in the senior school .
ks08 1 they elected him america 's 31st president .
ks08 1 the next morning we set out for seoul .
ks08 1 doing syntax is not easy .
ks08 1 he saw the man with the stick .
ks08 1 they parted the best of friends .
ks08 1 in the summer we always go to france .
ks08 1 last year i saw this film several times .
ks08 1 he baked tom the bread last night .
ks08 1 that they have completed the course is amazing .
ks08 1 the teacher made students happy .
ks08 1 we reminded him of the agreement .
ks08 1 in the garden stands a statue .
ks08 0 * in the garden stand a statue .
ks08 1 among the guests was sitting my friend louise .
ks08 0 * among the guests were sitting my friend louise .
ks08 1 this proved my hypothesis .
ks08 1 the students all enjoyed that summer .
ks08 1 the students all worked that summer .
ks08 1 the scientist made her a robot .
ks08 1 the students called me a teacher .
ks08 1 a big green insect flew into the soup .
ks08 1 john 's mother sent a letter to mary .
ks08 1 we placed the cheese in the refrigerator .
ks08 1 frank threw himself into the sofa .
ks08 1 the ice melted .
ks08 1 the vacuum cleaner frightens the child .
ks08 1 scientists found that the birds sang well in the evenings , but performed badly in the mornings .
ks08 0 * john put his gold .
ks08 0 * john put his gold safe .
ks08 0 * john put his gold to be under the bathtub .
ks08 1 john put his gold under the bathtub .
ks08 1 this is the box in which john put his gold .
ks08 1 this is the gold that john put under the bathtub .
ks08 0 * the king kept put his gold under the bathtub .
ks08 1 the king kept putting his gold under the bathtub .
ks08 1 the defendant denied the accusation .
ks08 0 * the defendant denied .
ks08 1 the teacher handed the student a book .
ks08 0 * the teacher handed the student .
ks08 1 they want to leave the meeting .
ks08 0 * they eager to leave the meeting .
ks08 1 the senators know that the president is telling a lie .
ks08 0 * the senators certain that the president is telling a lie .
ks08 1 be eager to leave the meeting .
ks08 0 * the senators to be certain that the president is telling a lie .
ks08 0 * the senators be certain that the president is telling a lie .
ks08 1 tom offered advice to his students in his office .
ks08 1 tom offered advice to his students with love .
ks08 1 john kept him behind the garage .
ks08 0 * john stayed kim behind the garage .
ks08 0 * john placed him busy .
ks08 1 john kept him busy .
ks08 0 * john stayed him busy .
ks08 0 * john placed behind the counter .
ks08 0 * john kept behind the counter .
ks08 1 john stayed behind the counter .
ks08 1 john deposited some money in the bank .
ks08 1 john deposited some money in the bank on friday .
ks08 0 * the un blamed global warming on humans on natural causes .
ks08 1 kim and sandy met in seoul in the lobby of the lotte hotel in march .
ks08 1 john deposited some money in the checking account and mary did the same thing .
ks08 1 john deposited some money in the checking account on friday and mary did the same thing .
ks08 1 john deposited some money in the checking account on friday and mary did the same thing on monday .
ks08 0 * john deposited some money in the checking account and mary did the same thing in the savings account .
ks08 0 * john gave a present to the student and mary did the same thing to the teacher .
ks08 0 * john locked fido in the garage and mary did so in the room .
ks08 0 * john ate a carrot and mary did so a radish .
ks08 1 kim jogs on the hill .
ks08 1 kim jogs under the hill .
ks08 1 kim jogs over the hill .
ks08 1 kim depends .
ks08 1 kim relies on sandy .
ks08 1 kim depends on sandy .
ks08 0 * kim depends at sandy .
ks08 1 john met a student in the park .
ks08 0 * john met in the park a student .
ks08 0 * the problem disappeared the accusation .
ks08 1 the problem disappeared .
ks08 0 * the boy gave the book .
ks08 1 the boy gave the baby the book .
ks08 1 the bird devours the worm .
ks08 1 the birds devour the worm .
ks08 1 every photo of max and sketch by his students appeared in the magazine .
ks08 1 no photo of max and sketch by his students appeared in the magazine .
ks08 0 * sketch by his students appeared in the magazine .
ks08 1 the present king of country music is more popular than the last one .
ks08 0 * the king of rock and roll is more popular than the one of country music .
ks08 1 which student were you talking about ?
ks08 0 * john put in the box .
ks08 0 * in the box put john the book .
ks08 1 the election results surprised everybody .
ks08 1 that he won the election surprised everybody .
ks08 1 john disappeared .
ks08 0 * john disappeared bill .
ks08 1 john coughed .
ks08 0 * john coughed the money .
ks08 1 the president looked weary .
ks08 1 the teacher became tired of the students .
ks08 1 the lasagna tasted delicious .
ks08 1 john remained somewhat calm .
ks08 1 the jury seemed ready to leave .
ks08 1 john became a success .
ks08 1 john seemed a fool .
ks08 1 john remained a student .
ks08 1 john saw fred .
ks08 1 alice typed the letter .
ks08 1 clinton supported the health care bill .
ks08 1 raccoons destroyed the garden .
ks08 1 the school board leader asked the students a question .
ks08 1 john taught new students english syntax .
ks08 1 the school board leader asked a question of the students .
ks08 1 the sexual revolution makes some people uncomfortable .
ks08 1 ad agencies call young people generation x-ers .
ks08 1 historians believe fdr to be our most effective president .
ks08 0 * john carried to the door .
ks08 1 tom locked fido in the garage .
ks08 1 tom bathed fido in the garage .
ks08 1 tom placed it under the table .
ks08 1 tom played it under the table .
ks08 1 i wonder if you will come back tomorrow .
ks08 1 you would have a reply if you come back tomorrow .
ks08 1 tom hid the manuscript in the cupboard .
ks08 1 fred hired sharon to change the oil .
ks08 1 they pushed the prisoners into the truck .
ks08 1 frank hopes to persuade harry to make the cook wash the dishes .
ks08 1 george mailed the attorney his photograph of the accident .
ks08 1 tom keeps asking karen 's sister to buy the car .
ks08 1 jane left the book on the table .
ks08 1 we have not confirmed whether the flight had been booked .
ks08 1 we saw him beaten by the champion .
ks08 1 they confined his remarks to the matter under discussion .
ks08 0 * oliver ascribed his longevity there .
ks08 0 * oliver mentioned charles the problem .
ks08 0 * oliver fined ten pounds to the prisoner .
ks08 0 * oliver drove me a lunatic .
ks08 0 * oliver addressed the king the letter .
ks08 1 the students of english from seoul faced many issues in the process of interpreting , transcribing , and editing the poems .
ks08 1 the love of my life and father of my children would never do such a thing .
ks08 1 the museum displayed no painting by miro or drawing by klee .
ks08 1 by law , every dog and cat in the area has to be neutered .
ks08 1 learning to use a language freely and fully is a lengthy and arduous process .
ks08 1 kim put the book in the box .
ks08 0 * kim put the book .
ks08 0 * is putting the book in the box .
ks08 0 * talked with bill about the exam .
ks08 1 they wrote to her .
ks08 1 they are kind to her .
ks08 1 they want to write to her .
ks08 0 * they want to wrote to her .
ks08 1 they want to be kind to her .
ks08 0 * they want to are kind to her .
ks08 1 the student knows the answers .
ks08 1 the student knew the answers .
ks08 1 the students know the answers .
ks08 0 * the student knowing the answers .
ks08 0 * the student known the answers .
ks08 1 he is writing another long book about beavers .
ks08 1 broadly speaking , the project was successful .
ks08 1 he is proud of his son 's passing the bar exam .
ks08 1 the chicken has eaten .
ks08 1 the chicken was eaten .
ks08 1 seen from this perspective , there is no easy solution .
ks08 1 the monkeys kept forgetting their lines .
ks08 0 * the monkeys kept forgot their lines .
ks08 0 * the monkeys kept forgotten their lines .
ks08 0 * we caught them ate the bananas .
ks08 0 * we caught them eat the bananas .
ks08 0 * we caught them eaten the bananas .
ks08 1 john made mary cook korean food .
ks08 0 * john made mary to cook korean food .
ks08 0 * john made mary cooking korean food .
ks08 1 the monkey seems despondent that it is in a cage .
ks08 1 the monkey seems despondent .
ks08 1 he seems intelligent to study medicine .
ks08 0 * he seems intelligent to study medicine .
ks08 1 monkeys are eager to leave .
ks08 0 * monkeys are eager leaving the compound .
ks08 0 * the chickens seem fond with the farmer .
ks08 1 the foxes seem compatible with the chickens .
ks08 0 * the foxes seem compatible for the chickens .
ks08 1 these are similar to the bottles .
ks08 0 * these are similar with the bottles .
ks08 1 the teacher is proud of his students .
ks08 0 * the teacher is proud with his students .
ks08 1 the contract is subject to approval by my committee .
ks08 0 * the contract is subject for approval by my committee .
ks08 1 there exists only one truly amphibian mammal .
ks08 1 there arose a great storm .
ks08 1 there exist few solutions which are cost-effective .
ks08 1 there is a riot in the park .
ks08 1 there remained just a few problems to be solved .
ks08 0 * there runs a man in the park .
ks08 0 * there sings a man loudly .
ks08 1 they believe that charles darwin 's theory of evolution is just a scientific theory .
ks08 1 they believe charles darwin 's theory of evolution is just a scientific theory .
ks08 1 john demanded that she stop phoning him .
ks08 1 joe warned the class that the exam would be difficult .
ks08 1 we told tom that he should consult an accountant .
ks08 1 mary convinced me that the argument was sound .
ks08 1 tom intends for sam to review that book .
ks08 1 john would prefer for the children to finish the oatmeal .
ks08 1 for john to either make up such a story or repeat it is outrageous .
ks08 1 for john either to make up such a story or to repeat it is outrageous .
ks08 1 for john to tell bill such a lie and bill to believe it is outrageous .
ks08 1 john intends to review the book .
ks08 1 john would prefer to finish the oatmeal .
ks08 1 tom tried to ask a question .
ks08 0 * tom tried for bill to ask a question .
ks08 1 tom tends to avoid confrontations .
ks08 0 * tom tends for mary to avoid confrontations .
ks08 1 joe hoped to find a solution .
ks08 0 * joe hoped for beth to find a solution .
ks08 1 john believed it .
ks08 1 john believed that he is honest .
ks08 1 john mentioned the issue to me .
ks08 1 john mentioned to me that the question is an issue .
ks08 1 she pinched his arm as hard as she could .
ks08 0 * she pinched that he feels pain .
ks08 1 we hope that such a vaccine could be available in ten years .
ks08 0 * we hope the availability of such a vaccine in ten years .
ks08 1 cohen proved the independence of the continuum hypothesis .
ks08 1 cohen proved that the continuum hypothesis was independent .
ks08 1 john bothers me .
ks08 1 that john coughed bothers me .
ks08 1 john loves bill .
ks08 0 * that john coughs loves bill .
ks08 1 that john sold the ostrich surprised bill .
ks08 1 for john to train his horse would be desirable .
ks08 1 to train his horse would be desirable .
ks08 1 that the king or queen be present is a requirement on all royal weddings .
ks08 1 which otter you should adopt first is unclear .
ks08 0 * that tom missed the lecture was enjoyable .
ks08 0 * for john to remove the mother is undeniable .
ks08 0 * how much money gordon spent is true .
ks08 1 tom is confident that the elephants respect him .
ks08 1 tom is insistent that the defendants be truthful .
ks08 1 tom seems eager for her brother to catch a cold .
ks08 1 tom seems eager to catch a cold .
ks08 1 i am ashamed that i neglected you .
ks08 1 i am delighted that mary finished his thesis .
ks08 1 we were thankful that no one had been hurt .
ks08 1 we were glad it was over .
ks08 1 bill alleged that fred signed the check .
ks08 1 we believe that the directors were present .
ks08 1 we convinced him that the operation is safe .
ks08 0 * alan is thinking about that his students are eager to learn english .
ks08 0 * fred is counting on for tom to make an announcement .
ks08 1 the outcome depends on how many candidates participate in the election .
ks08 1 fred is thinking about whether he should stay in seoul .
ks08 1 the offer made smith admire the administrators .
ks08 1 john tried to make sam let george ask bill to keep delivering the mail .
ks08 1 john enjoyed drawing trees for his syntax homework .
ks08 1 the picture on the wall reminded him of his country .
ks08 1 free enterprise is compatible with american values and traditions .
ks08 1 we need to be in frequent contact with the clients .
ks08 1 acknowledge that everyone has limits .
ks08 1 we are aware of the existing problems .
ks08 0 * why do n't you leaving me concentrate on my work ?
ks08 0 * the general commended that all troops was in dress uniform .
ks08 0 * my morning routine features swim free styles slowly for one hour .
ks08 0 * you should avoid to travel in the rush hour .
ks08 0 * you should attempt answering every question .
ks08 0 * the authorities blamed greenpeace with the bombing .
ks08 0 * the authorities charged the students of the cheating .
ks08 0 * sharon has been eager finishing the book .
ks08 0 * we respect mary 's desire for becoming famous .
ks08 0 * john referred from the building .
ks08 0 * john died to heart disease .
ks08 0 * we were glad what to do .
ks08 0 * she was busy to make lunch .
ks08 1 the constant rain forced the abandonment of the next day 's competitions .
ks08 1 aloe may have an analgesic effect on inflammation and minor skin irritations .
ks08 1 the public never had faith in his ability to handle the job .
ks08 1 he repeated his claim that the people backed his action .
ks08 1 we made them take the money .
ks08 0 * we made them are rude .
ks08 1 do not use these words in the beginning of a sentence .
ks08 1 we know the defendants seem eager to testify against the criminal .
ks08 1 jane is n't sure whether the students keep the books .
ks08 0 * book is available in most countries .
ks08 0 * student studies english for 4 hours a day .
ks08 1 students study english for 4 hours a day .
ks08 1 his friend learned dancing .
ks08 1 my bother 's friend learned dancing .
ks08 1 the president 's bodyguard learned surveillance .
ks08 1 the king of rock and roll 's records led to dancing .
ks08 1 president lincoln delivered his gettysburg address in 1863 .
ks08 0 * president lincoln delivered her gettysburg address in 1863 .
ks08 0 * after reading the pamphlet , judy threw them into the garbage can .
ks08 1 after the party , i asked myself why i had faxed invitations to everyone in my office building .
ks08 1 edward usually remembered to send a copy of his e-mail to himself .
ks08 1 no john smiths attended the meeting .
ks08 1 this john smith lives in seoul .
ks08 1 there are three davids in my class .
ks08 1 it 's nothing like the america i remember .
ks08 1 my brother is an einstein at maths .
ks08 1 in the book , he talks about his ups and downs at school .
ks08 1 if john wants to succeed in corporate life , he has to know the rules of the game .
ks08 1 the critique of plato 's republic was written from a contemporary point of view .
ks08 1 the characters in shakespeare 's twelfth night live in a world that has been turned upside-down .
ks08 0 * the characters in shakespeare 's twelfth night lives in a world that has been turned upside-down .
ks08 1 students studying english read conrad 's heart of darkness while at university .
ks08 1 you is the only person that i can rely on .
ks08 0 * you are the only person that i can rely on .
ks08 1 he is the only person that i can rely on .
ks08 0 * he are the only person that i can rely on .
ks08 1 the boy swims .
ks08 0 * the boys swim .
ks08 1 king prawns cooked in chili salt and pepper was very much better , a simple dish deliciously executed .
ks08 1 four pounds was quite a bit of money in 1950 and it was not easy to come by .
ks08 1 five pounds is a lot of money .
ks08 0 * five pounds are a lot of money .
ks08 1 two drops sanitize anything in your house .
ks08 0 * two drops sanitize anything in your house .
ks08 1 fifteen dollars in a week is much .
ks08 0 * fifteen dollars in a week are not much .
ks08 1 fifteen years represents a long period of his life .
ks08 0 * fifteen years represent a long period of his life .
ks08 1 two miles is as far as they can walk .
ks08 0 * two miles are as far as they can walk .
ks08 1 this government have been more transparent in the way they have dealt with public finances than any previous government .
ks08 1 this government has been more transparent in the way they have dealt with public finances than any previous government .
ks08 1 in preparation for the return fixture this team has trained more efficiently than they had in recent months .
ks08 1 she does n't believe much of that story .
ks08 1 we listened to as little of his speech as possible .
ks08 1 how much of the fresco did the flood damage ?
ks08 0 * she does n't believe much story .
ks08 0 * we listened to as little speech as possible .
ks08 0 * how much fresco did the flood damage ?
ks08 0 * i read some book .
ks08 1 one of the people was dying of thirst .
ks08 1 many of the people were dying of thirst .
ks08 0 * one people was dying of thirst .
ks08 1 many people were dying of thirst .
ks08 1 each of the suggestions is acceptable .
ks08 1 neither of the cars has air conditioning .
ks08 1 none of these men wants to be president .
ks08 1 most of the children are here .
ks08 1 some of the soup needs more salt .
ks08 1 some of the diners need menus .
ks08 1 all of the land belongs to the government .
ks08 1 all of these cars belong to me .
ks08 1 john is in the room .
ks08 1 i am fond of him .
ks08 1 most of john 's boat has been repainted .
ks08 1 some of the record contains evidence of wrongdoing .
ks08 1 much of that theory is unfounded .
ks08 0 * one of the story has appeared in your newspaper .
ks08 1 he is afraid of foxes .
ks08 1 it is a wooden desk .
ks08 1 it is the main street .
ks08 0 * it is an alive fish .
ks08 0 * they are afraid people .
ks08 0 * this objection is main .
ks08 0 * this fact is key .
ks08 1 the man eager to start the meeting is john 's sister .
ks08 1 the man holding the bottle disappeared .
ks08 1 the papers removed from the safe have not been found .
ks08 1 the money that you gave me disappeared last night .
ks08 0 * john in the doorway waved to his father .
ks08 0 * he in the doorway waved to his father .
ks08 1 and index values of the subject and the main verb .
ks08 1 neither of these men is worthy to lead italy .
ks08 1 none of his customary excuses suffices edgar now .
ks08 1 one of the problems was the robins .
ks08 1 all of the plant virus web sites have been conveniently collected in one central location .
ks08 1 some of the water from melted snow also goes into the ground for plants .
ks08 1 most of the milk your baby consumes during breastfeeding is produced during nursing .
ks08 1 all special rights of voting in the election were abolished .
ks08 1 one of major factors affecting the value of diamonds was their weight .
ks08 1 each of these stones has to be cut and polished .
ks08 1 most of her free time was spent attending concerts and plays or visiting museums and art galleries .
ks08 1 the committee was unanimous in their decision .
ks08 1 the committee have all now resigned .
ks08 0 * the committee has all now resigned .
ks08 1 the crew have both agreed to change sponsor .
ks08 0 * the crew has both agreed to change sponsor .
ks08 1 her family is all avid skiers .
ks08 0 * her family are all avid skiers .
ks08 0 * a variety of styles have been in vogue for the last year .
ks08 1 both of the workers will wear carnations .
ks08 1 both the workers will wear carnations .
ks08 1 both will wear carnations .
ks08 1 few doctors approve of our remedy .
ks08 1 few approve of our remedy .
ks08 1 an example of these substances be tobacco .
ks08 1 the effectiveness of teaching and learning depend on several factors .
ks08 1 one of the most serious problems that some students have be lack of motivation .
ks08 1 ten years be a long time to spend in prison .
ks08 1 everyone of us be given a prize .
ks08 1 some of the fruit be going bad .
ks08 1 all of his wealth come from real estate investments .
ks08 1 do some of your relatives live nearby ?
ks08 1 fifty pounds seem like a lot of weight to lose in one year .
ks08 1 news of persephone and demeter reach the great gods and goddesses of olympus .
ks08 1 half of the year be dark and wintry .
ks08 1 some of the promoters of ostrich meat compare its taste to beef tenderloin .
ks08 1 the committee has n't yet made up its mind .
ks08 0 * the committee has n't yet made up their mind .
ks08 1 the committee have n't yet made up their mind .
ks08 0 * the committee have n't yet made up its mind .
ks08 0 * that dog is so ferocious , it even tried to bite himself .
ks08 0 * i washed me .
ks08 1 i washed myself .
ks08 0 * you washed myself .
ks08 1 i washed you .
ks08 1 he kicked you .
ks08 0 * i washed yourself .
ks08 1 you washed yourself .
ks08 1 harry says that sally dislikes him .
ks08 0 * harry says that sally dislikes himself .
ks08 0 * sally wishes that everyone would praise herself .
ks08 1 sally believes that she is brilliant .
ks08 0 * sally believes that herself is brilliant .
ks08 1 the power of your mind and the power of your body have a tight connection .
ks08 1 john tries to fix the computer .
ks08 1 john seems to fix the computer .
ks08 1 mary persuaded john to fix the computer .
ks08 1 mary expected john to fix the computer .
ks08 0 * john to fix the computer .
ks08 0 * seems john to fix the computer .
ks08 1 john tries to be honest .
ks08 1 john seems to be honest .
ks08 1 john makes efforts for himself to be honest .
ks08 1 it seems that john is honest .
ks08 1 it tends to be warm in september .
ks08 1 it seems to bother kim that they resigned .
ks08 0 * it tries to be warm in september .
ks08 0 * it hopes to bother kim that they resigned .
ks08 1 it is easy to please kim .
ks08 1 john is eager to please kim .
ks08 0 * there tries to be warm in september .
ks08 0 * there hopes to bother kim that they resigned .
ks08 0 * it is eager to please kim .
ks08 1 stephen seemed to be intelligent .
ks08 1 it seems to be easy to fool ben .
ks08 1 there is likely to be a letter in the mailbox .
ks08 1 tabs are likely to be kept on participants .
ks08 0 * john seems to be easy to fool ben .
ks08 0 * john is likely to be kept on participants .
ks08 1 sandy tried to eat oysters .
ks08 0 * there tried to be riots in seoul .
ks08 0 * it tried to bother me that chris lied .
ks08 0 * tabs try to be kept on bob by the fbi .
ks08 0 * that he is clever is eager to be obvious .
ks08 1 the king thanked the man .
ks08 1 the color red seems to be his favorite color .
ks08 1 the cat seems to be out of the bag .
ks08 1 the dentist is likely to examine pat .
ks08 1 pat is likely to be examined by the dentist .
ks08 1 the dentist is eager to examine pat .
ks08 1 pat is eager to be examined by the dentist .
ks08 1 stephen believed ben to be careful .
ks08 1 stephen persuaded ben to be careful .
ks08 1 stephen believed it to be easy to please kim .
ks08 0 * stephen persuaded it to be easy to please kim .
ks08 1 stephen believed there to be a fountain in the park .
ks08 0 * stephen persuaded there to be a fountain in the park .
ks08 1 stephen believed the cat to be out of the bag .
ks08 0 * stephen persuaded the cat to be out of the bag .
ks08 1 the dentist was believed to have examined pat .
ks08 1 pat was believed to have been examined by the dentist .
ks08 1 the dentist was persuaded to examine pat .
ks08 1 stephen seems to be irritating .
ks08 1 tom believes stephen to be irritating .
ks08 1 john persuaded stephen to be more careful .
ks08 0 * it seemed to be intelligent .
ks08 1 it seemed to rain .
ks08 1 there seemed to be a fountain in the park .
ks08 1 stephen tried to be intelligent .
ks08 0 * it tried to be intelligent .
ks08 0 * there tried to be intelligent .
ks08 0 * it tried to rain .
ks08 0 * there tried to be a fountain in the park .
ks08 1 someone tried to leave the town .
ks08 1 there seems to be a fountain in the park .
ks08 0 * it seems to be a fountain in the park .
ks08 0 * john seems to be a fountain in the park .
ks08 1 we believed there to be a fountain in the park .
ks08 0 * we believed it to be a fountain in the park .
ks08 0 * there tries to leave the country .
ks08 0 * we believed it to try to leave the country .
ks08 0 * we believed there to try to leave the country .
ks08 1 we believed john to try to leave the country .
ks08 1 the cat tries to be out of the bag .
ks08 1 they persuaded me to leave .
ks08 1 they promised me to leave .
ks08 0 * they persuaded it to rain .
ks08 0 * they promised it to rain .
ks08 1 under the bed is a fun place to hide .
ks08 0 * under the bed wants to be a fun place to hide .
ks08 1 kim may have admitted to let mary mow the lawn .
ks08 1 gregory appears to have wanted to be loyal to the company .
ks08 1 jones would prefer for it to be clear to barry that the city plans to sue him .
ks08 1 john continues to avoid the conflict .
ks08 1 the captain ordered the troops to proceed .
ks08 1 he coaxed his brother to give him the candy .
ks08 1 john wants it to be clear to ben that the city plans to honor him .
ks08 0 * john seems to rain .
ks08 0 * john is likely to appear that he will win the game .
ks08 0 * beth tried for bill to ask a question .
ks08 0 * he believed there to be likely that he won the game .
ks08 0 * it is likely to seem to be arrogant .
ks08 0 * sandy appears that kim is happy .
ks08 0 * dana would be unlikely for pat to be called upon .
ks08 0 * robin is nothing in the box .
ks08 0 * it said that kim was happy .
ks08 0 * there preferred for sandy to get the job .
ks08 1 there is only one chemical substance involved in nerve transmission .
ks08 0 * there are only one chemical substance involved in nerve transmission .
ks08 0 * there is more chemical substances involved in nerve transmission .
ks08 1 there are more chemical substances involved in nerve transmission .
ks08 1 there is believed to be a sheep in the park .
ks08 0 * there is believed to be a sheep in the park .
ks08 1 there are believed to be sheep in the park .
ks08 1 there seems to be no student absent .
ks08 0 * there are likely to be no student absent .
ks08 1 there is likely to be no student absent .
ks08 1 pat expected leslie to be aggressive .
ks08 1 pat persuaded leslie to be aggressive .
ks08 1 pat promised leslie to be aggressive .
ks08 1 kevin urged anne to be loyal to her .
ks08 1 we expect the dentist to examine us .
ks08 0 * we expect the dentist to examine ourselves .
ks08 1 we expect them to examine themselves .
ks08 1 we persuaded the dentist to examine us .
ks08 0 * we persuaded the dentist to examine ourselves .
ks08 1 we persuaded them to examine themselves .
ks08 0 * we persuaded them to examine them .
ks08 1 john may drink water , and bill drink beer .
ks08 1 tom will not leave .
ks08 0 * tom kicked not a ball .
ks08 1 will tom leave the party now ?
ks08 0 * left tom the party already ?
ks08 1 john could n't leave the party .
ks08 0 * john left n't the party early .
ks08 1 if anybody is spoiling the children , john is .
ks08 0 * if anybody keeps spoiling the children , john keeps .
ks08 1 you should leave , should n't you ?
ks08 0 * you did n't leave , left you ?
ks08 1 she would never believe that story .
ks08 0 * she believed never his story .
ks08 1 the boys will all be there .
ks08 0 * our team played all well .
ks08 1 the children will have been being entertained .
ks08 0 * the house is been remodelling .
ks08 0 * margaret has had already left .
ks08 0 * he has will seeing his children .
ks08 0 * he has been must being interrogated by the police at that very moment .
ks08 1 mary solved the problem .
ks08 1 mary would solve the problem .
ks08 1 mary was solving the problem .
ks08 1 mary would easily solve the problem .
ks08 0 * mary not avoided bill .
ks08 1 mary did not avoid bill .
ks08 1 fred must have been singing songs and probably was drinking beer .
ks08 1 fred must both have been singing songs and have been drinking beer .
ks08 1 fred must have both been singing songs and been drinking beer .
ks08 1 fred must have been both singing songs and drinking beer .
ks08 1 there might be a unicorn in the garden .
ks08 1 it will rain tomorrow .
ks08 1 john will leave the party earlier .
ks08 0 * there hopes to finish the project .
ks08 0 * the bus hopes to be here at five .
ks08 0 * i hope to can study in france .
ks08 1 i hope to study in france .
ks08 0 * john stopped can to sign in tune .
ks08 0 * john stopped canning to sign in tune .
ks08 0 * john wills leave the party early .
ks08 0 * john can kicked the ball .
ks08 0 * john can kicking the ball .
ks08 0 * john can to kick the ball .
ks08 1 john will kick the ball .
ks08 0 * john will kicked the ball .
ks08 0 * john will to kick the ball .
ks08 0 * kim must bakes a cake .
ks08 0 * kim must baked a cake .
ks08 0 * kim must will bake a cake .
ks08 1 there may exist a man in the park .
ks08 0 * it may exist a man in the park .
ks08 0 * it is vital that we will study everyday .
ks08 1 he is a fool .
ks08 1 he has a car .
ks08 1 john is running to the car .
ks08 1 was the child in the school ?
ks08 1 was the child running to the car ?
ks08 1 was the child found ?
ks08 1 the child never became crazy .
ks08 1 the child was never crazy .
ks08 1 the child was never running to the car .
ks08 1 the child was never deceived .
ks08 1 john is happy about the outcome .
ks08 1 john was seeing his children .
ks08 1 the children are seen in the yard .
ks08 1 john has not sung a song .
ks08 1 has john sung a song ?
ks08 1 john has n't been singing a song .
ks08 1 john has sung a song and mary has too .
ks08 1 john can have danced .
ks08 1 john can be dancing .
ks08 1 he has seen his children .
ks08 1 he will have been seeing his children .
ks08 0 * americans have paying income tax ever since 1913 .
ks08 0 * george has went to america .
ks08 1 is out since the following is finite .
ks08 1 you are a student .
ks08 1 you have not enough money .
ks08 1 have you enough money ?
ks08 1 john does not like this town .
ks08 1 in no other circumstances does that distinction matter .
ks08 1 they did n't leave any food .
ks08 0 * they expected us to do leave him .
ks08 0 * they expected us to should leave him .
ks08 0 * i found myself doing need sleep .
ks08 0 * he does be leaving .
ks08 0 * he does have been eating .
ks08 0 * they will do come .
ks08 1 john did leave .
ks08 1 did john find the solution ?
ks08 1 how long did it last ?
ks08 1 john may leave .
ks08 1 it may rain .
ks08 0 * john may rain .
ks08 1 john did not leave .
ks08 0 * john did not rain .
ks08 1 he might have left .
ks08 0 * he might do leave .
ks08 0 * he does can leave here .
ks08 0 * he does may leave here .
ks08 0 * jim does have supported the theory .
ks08 0 * the proposal did be endorsed by clinton .
ks08 0 * i do not have sung .
ks08 0 * i do not be happy .
ks08 1 do be honest !
ks08 1 do n't be silly !
ks08 0 * john believed kim to do not leave here .
ks08 1 john believes kim not to leave here .
ks08 0 * john believed kim to leaving here .
ks08 0 * john did not leaving here .
ks08 0 * john expect to must leave .
ks08 0 * john did not may leave .
ks08 1 tom wanted to go home , but peter did n't want to .
ks08 1 lee voted for bill because his father told him to .
ks08 1 kim regrets not having seen the movie .
ks08 1 kim regrets never having seen the movie .
ks08 1 we asked him not to try to call us again .
ks08 1 we asked him never to try to call us again .
ks08 1 duty made them not miss the weekly meetings .
ks08 1 duty made them never miss the weekly meetings .
ks08 1 not speaking english is a disadvantage .
ks08 0 * speaking not english is a disadvantage .
ks08 0 * lee likes not kim .
ks08 1 lee is believed not to like kim .
ks08 1 lee is believed to not like kim .
ks08 0 * lee is believed to like not kim .
ks08 1 the president could not approve the bill .
ks08 1 it would be possible for the president not to approve the bill .
ks08 1 it would not be possible for the president to approve the bill .
ks08 0 * lee not left .
ks08 1 lee will never leave .
ks08 1 lee will not leave .
ks08 1 john could not leave the town .
ks08 0 * john not left the town .
ks08 0 * john not could leave the town .
ks08 1 mary sang a song , but lee never did .
ks08 0 * mary sang a song , but lee did never .
ks08 1 mary sang a song , but lee did not .
ks08 1 the president could not approve the bill , could n't he ?
ks08 0 * the president could not approve the bill , could he ?
ks08 1 are you studying english syntax ?
ks08 1 what are you studying nowadays ?
ks08 1 i shall go downtown .
ks08 1 shall i go downtown ?
ks08 1 may she live forever !
ks08 1 was i that stupid ?
ks08 1 do n't you even touch that !
ks08 1 you better not drink .
ks08 1 you can do it , but you better not .
ks08 0 * better you not drink .
ks08 1 they 'd leave soon .
ks08 1 they would n't leave soon .
ks08 1 they should n't leave soon .
ks08 1 they can do it , ca n't they ?
ks08 1 they ca n't do it , can they ?
ks08 0 * they ca n't do it , ca n't they ?
ks08 0 * they ca n't do it , can he ?
ks08 1 kim can dance , and sandy can , too .
ks08 1 kim has danced , and sandy has , too .
ks08 1 kim was dancing , and sandy was , too .
ks08 0 * kim considered joining the navy , but i never considered .
ks08 0 * kim wanted to go and sandy wanted , too .
ks08 1 kim is happy and sandy is too .
ks08 1 when kim was in china , i was too .
ks08 1 have you anything to share with the group ?
ks08 1 have you brought anything to share with the group ?
ks08 1 sandy must have been , too .
ks08 1 sandy must have , too .
ks08 1 sandy must , too .
ks08 1 because john persuaded sally to , he did n't have to talk to the reporters .
ks08 0 * mary sang a song , but lee could never .
ks08 1 mary sang a song , but lee could not .
ks08 1 john got sent to prison .
ks08 1 he ought to leave his luggage here .
ks08 1 he dared not argue against his parents .
ks08 1 he used to go there very often .
ks08 1 the gardener must trim the rose bushes today .
ks08 1 this should be the beginning of a beautiful friendship .
ks08 1 i am removing the shovel from the shed .
ks08 1 the travelers have returned from their vacation .
ks08 1 springfield would have built a police station with the federal grant .
ks08 1 sharks could have been cruising near the beach .
ks08 1 she seem to have given financial assistance to an important french art dealer .
ks08 0 * ann may spending her vacation in italy .
ks08 0 * ann may spends her vacation in italy .
ks08 0 * ann may spent her vacation in italy .
ks08 1 it has rained every day for the last week .
ks08 0 * it has raining every day for the last week .
ks08 0 * it has rains every day for the last week .
ks08 0 * it has rain every day for the last week .
ks08 1 tagalog is spoken in the philippines .
ks08 0 * tagalog is speak in the philippines .
ks08 0 * tagalog is speaks in the philippines .
ks08 0 * tagalog is spoke in the philippines .
ks08 1 the roof is leaking .
ks08 0 * the roof is leaked .
ks08 0 * the roof is leaks .
ks08 0 * george is having lived in toledo for thirty years .
ks08 0 * the house is been remodeling .
ks08 0 * a medal was been given to the mayor by the sewer commissioner .
ks08 0 * does john have gone to the library ?
ks08 0 * john seems fond of ice cream , and bill seems , too .
ks08 1 sam may have been being interrogated by the fbi .
ks08 0 * sam may have been being interrogating by the fbi .
ks08 0 * sam may be had been interrogating by the fbi .
ks08 1 have social problems made police work difficult ?
ks08 1 the senator should not have forgotten the concerns of her constituents .
ks08 1 tokyo has not loosened trade restrictions .
ks08 1 did the doctor prescribe aspirin ?
ks08 1 sandy will read your reports , but harold will not .
ks08 1 he can hardly believe that it 's already over .
ks08 1 i could have little known that more trouble was just around the corner .
ks08 1 i have never been spoken to so rudely !
ks08 1 hardly was there any rain falling .
ks08 1 little did i know that more trouble was just around the corner .
ks08 1 never have i been spoken to so rudely !
ks08 1 he had hardly collected the papers on his desk , had he ?
ks08 0 * he had hardly collected the papers on his desk , had n't he ?
ks08 1 he never achieved anything , did he ?
ks08 0 * he never achieved anything , did n't he ?
ks08 1 as a statesman , he scarcely could do anything worth mentioning .
ks08 0 * as a statesman , scarcely he could do anything worth mentioning .
ks08 0 * any zebras ca n't fly .
ks08 0 * anything has n't happened to his optimism .
ks08 0 * any of the citizens hardly ever say anything .
ks08 1 i did n't find any bugs in my bed .
ks08 1 nobody told them anything .
ks08 1 never have i stolen from any members of your family .
ks08 1 why have n't any books been returned ?
ks08 1 hardly any of the citizens ever say anything .
ks08 1 these lines were written by one of korea 's most famous poets .
ks08 1 the unidentified victim was apparently struck during the early morning hours .
ks08 1 targets can be observed at any angle .
ks08 1 during the early evening , saturn can be found in the north , while jupiter rises in the east .
ks08 1 i poured 20 liters of acid into the beaker .
ks08 1 about 20 liters of acid was poured into the beaker .
ks08 1 the executive committee approved the new policy .
ks08 1 the new policy was approved by the executive committee .
ks08 1 john has taken bill to the library .
ks08 1 john has chosen bill for the position .
ks08 0 * john has taken to the library .
ks08 0 * john has chosen for the position .
ks08 0 * the guide has been taken john to the library .
ks08 0 * the department has been chosen john for the position .
ks08 1 john has been taken to the library .
ks08 1 john has been chosen for the position .
ks08 1 pat handed a book to chris .
ks08 0 * pat handed to chris .
ks08 0 * pat handed a book .
ks08 1 a book was handed to chris by pat .
ks08 0 * a book was handed by pat .
ks08 1 a book was handed to chris .
ks08 0 * a book was handed .
ks08 1 they believe it to be easy to annoy ben .
ks08 0 * they believe stephen to be easy to annoy ben . ks08 1 they believe there to be a dragon in the wood . ks08 1 it is believed to be easy to annoy ben . ks08 0 * stephen is believed to be easy to annoy ben . ks08 1 there is believed to be a dragon in the wood . ks08 1 no one believes that he is a fool . ks08 1 no one suspects that he is a fool . ks08 1 that he is a fool is suspected by no one . ks08 1 they believe the cat to be out of the bag . ks08 1 the cat is believed to be out of the bag . ks08 1 john drove the car . ks08 1 john was driving the car . ks08 1 the car was being driven . ks08 1 john will drive the car . ks08 1 the car will be driven . ks08 1 john has driven the car . ks08 1 the car has been driven . ks08 1 john has been driving the car . ks08 1 the car has been being driven . ks08 1 the car will have been being driven . ks08 1 pat handed chris a note . ks08 1 chris was handed a note . ks08 1 chris was handed a note by pat . ks08 1 ideas are put into children 's heads by tv . ks08 1 yesterday , the child really kicked a monkey in the street . ks08 1 the model resembles kim in nearly every detail . ks08 0 * kim is resembled by the model in nearly every detail . ks08 0 * you are not fitted by the coat . ks08 1 i was born in 1970 . ks08 1 it is rumored that he is on his way out . ks08 1 john is said to be rich . ks08 1 he is reputed to be a good scholar . ks08 0 * my mother bore me in 1970 . ks08 0 * everyone rumored that he was on his way out . ks08 0 * they said him to be rich . ks08 0 * they reputed him to be a good scholar . ks08 1 he kicked the ball . ks08 1 the ball was kicked by him . ks08 1 john kicked him . ks08 1 he was kicked by john . ks08 1 john sent her to seoul . ks08 1 she was sent to seoul . ks08 1 they widely believed that john was ill . ks08 1 that john was ill was widely believed . ks08 1 they have n't decided which attorney will give the closing argument . 
ks08 1 which attorney will give the closing argument has n't been decided . ks08 1 which attorney will give the closing argument has n't been decided by them . ks08 1 you can rely on ben . ks08 1 ben can be relied on . ks08 1 they talked about the scandal for days . ks08 1 the scandal was talked about for days . ks08 1 the issue was dealt with promptly . ks08 1 that 's not what 's asked for . ks08 1 this should be attended to immediately . ks08 0 * the capital was gathered near by a crowd of people . ks08 0 * the hot sun was played under by the children . ks08 1 that 's something i would have paid twice for . ks08 1 these are the books that we have gone most thoroughly over . ks08 1 they look generally on john as selfish . ks08 0 * everything was paid twice for . ks08 0 * your books were gone most thoroughly over . ks08 0 * he is looked generally on as selfish . ks08 1 pavarotti relied on loren and bond on hepburn . ks08 0 * pavarotti relied on loren and bond hepburn . ks08 1 loren was relied on by pavarotti and hepburn by bond . ks08 0 * loren was relied on by pavarotti and hepburn on by bond . ks08 1 the lawyer looked into the document . ks08 1 the document was looked into by the lawyer . ks08 1 peter has been asked to resign . ks08 1 i assume the matter to have been filed in the appropriate records . ks08 1 smith wants the picture to be removed from the office . ks08 1 the events have been described well . ks08 1 over 120 different contaminants have been dumped into the river . ks08 1 the balloon is positioned in an area of blockage and is inflated . ks08 1 cancer is now thought to be unlikely to be caused by hot dogs . ks08 1 whether this is feasible has n't yet been determined . ks08 1 paying taxes ca n't be avoided . ks08 1 it has n't yet been determined whether this is feasible . ks08 1 frances has had the drapes cleaned . ks08 1 shirley seems to have fred promoted . ks08 1 nina got bill elected to the committee . 
ks08 1 we got our car radio stolen twice on holiday . ks08 1 frances has had her clean the drapes . ks08 1 nina got them to elect bill . ks08 1 the news was dealt with carefully . ks08 1 the tree was looked after by kim . ks08 1 we can not put up with the noise anymore . ks08 1 he will keep up with their expectations . ks08 1 this noise can not be put up with . ks08 1 their expectations will be kept up with . ks08 1 they paid a lot of attention to the matter . ks08 1 the son took care of his parents . ks08 1 the matter was paid a lot of attention to . ks08 1 a lot of attention was paid to the matter . ks08 0 * new york was slept in . ks08 0 * the lake was camped beside by my sister . ks08 1 the lake is not to be camped beside by anybody . ks08 0 * six inches were grown by the boy . ks08 0 * a mile to work was run by him . ks08 1 the beans were grown by the gardener . ks08 1 the plums were weighed by the grocer . ks08 0 * san francisco has been lived in by my brother . ks08 1 the house has been lived in by several famous personages . ks08 0 * seoul was slept in by the businessman last night . ks08 1 this bed was surely slept in by a huge guy last night . ks08 1 rosie got struck by lightning . ks08 1 i got phoned by a woman friend . ks08 1 he got hit in the face with the tip of a surfboard . ks08 1 john 's bike got fixed or got stolen . ks08 0 * the lesson got read by a priest . ks08 0 * the letter got written by a poet . ks08 0 * tom got understood to have asked for a refund . ks08 0 * mary got heard to insult her parents . ks08 1 is john clever ? ks08 1 who is clever ? ks08 1 how clever you are ! ks08 1 be very clever . ks08 1 i ask you if this is what you want . ks08 1 would you mind taking out the garbage ? ks08 1 can the child read the book ? ks08 1 what can the child read ? ks08 1 which version did they recommend ? ks08 1 with what did the baby eat the food ? ks08 1 how did he eat the food ? ks08 1 which man did you talk to ? ks08 1 to which man did you talk ? 
ks08 1 how ill has hobbs been ? ks08 0 * which man did you talk ? ks08 0 * to which man did you talk to ? ks08 1 who do you think hobbs imagined mary said tom saw ? ks08 1 who did kim work for and sandy rely on ? ks08 0 * who did kim work for and sandy rely ? ks08 0 * who did kim work for and sandy rely on mary ? ks08 1 you can rely on edward 's help . ks08 1 edward 's help , you can rely on . ks08 1 we talked about the fact that he was sick for days . ks08 1 the fact that he was sick for days , we talked about . ks08 0 * you can rely on that he will help you . ks08 0 * we talked about that he was sick for days . ks08 1 that he was sick , we talked about for days . ks08 1 that arrows do n't stop in midair is captured by this theory . ks08 0 * who did you see and a picture of ? ks08 1 these qualities recommended him to oliver . ks08 1 the un recommended an enlarged peacekeeping force . ks08 1 this is the book which the teacher recommended . ks08 1 who will they recommend ? ks08 1 john put the books in a box . ks08 1 which books did john put in the box ? ks08 1 where did john put the books ? ks08 1 in which box did john put the book ? ks08 1 how happy has john been ? ks08 1 who put the book in the box ? ks08 1 who did put the book in the box ? ks08 1 who can put the book in the box ? ks08 1 who do you think visited seoul last year ? ks08 1 that 's the un delegate that the government thinks visits seoul last year . ks08 1 who do you believe that sara invited ? ks08 1 who do you believe invited sara ? ks08 0 * who do you believe that invited sara ? ks08 0 * who do you think that would be nominated for the position ? ks08 1 this is the kind of person who i doubt that under normal circumstances would have anything to do with such a scheme . ks08 1 john asks whose book his son likes . ks08 1 john has forgotten which player his son shouted at . ks08 1 he told me how many employees karen introduced to the visitors . ks08 1 he had been reading the article . 
ks08 0 * tom denied which book he had been reading . ks08 0 * tom claimed how much money she had spent . ks08 0 * john inquired that he should read it . ks08 0 * peter will decide that we should review the book . ks08 1 john inquired which book he should read . ks08 1 peter will decide which book we should review . ks08 1 john told us that we should review the book . ks08 1 john told us which book we should review . ks08 1 in which box did he put the book ? ks08 1 which book by his father did he read ? ks08 1 john asks in which box he put the book . ks08 1 john asks which book by his father he read . ks08 1 kim has wondered in which room gary stayed . ks08 1 lee asked me how fond of chocolates the monkeys are . ks08 0 * kim has wondered that gary stayed in the room . ks08 0 * kim asked me that the monkeys are very fond of chocolates . ks08 1 john knows whose book mary bought and tom borrowed from her . ks08 0 * john knows whose book mary bought and tom talked . ks08 1 i do n't know whether i should agree . ks08 1 she gets upset if i exclude her from anything . ks08 1 she gets upset whether i exclude her from anything . ks08 1 i wonder if you 'd be kind enough to give us information . ks08 1 i am not certain about when he will come . ks08 1 i am not certain about whether he will go or not . ks08 0 * i am not certain about if he will come . ks08 0 * i am not certain about if he will go or not . ks08 1 i do n't know where to go . ks08 1 i do n't know what to do . ks08 1 i do n't know how to do it . ks08 1 i do n't know whether to agree with him or not . ks08 0 * i do n't know if to agree with him . ks08 0 * i do n't know that to agree with him or not . ks08 1 fred knows which politician to support . ks08 1 karen asked where to put the chairs . ks08 1 the student protected him . ks08 1 who protected him ? ks08 1 to protect him is not an easy task . ks08 0 * fred knows which politician for karen to vote for . ks08 0 * fred knows which politician for her to vote for . 
ks08 0 * karen asked where for jerry to put the chairs . ks08 0 * karen asked where for him to put the chairs . ks08 1 how carefully have you considered your future career ? ks08 1 when can we register for graduation ? ks08 1 where do we go to register for graduation ? ks08 1 why have you borrowed my pencil ? ks08 1 when did he say that he was fired ? ks08 1 where did he tell you that he met mary ? ks08 1 why do you wonder whether she will invite me ? ks08 1 how often did he ask when she will meet at the party ? ks08 1 what causes students to select particular majors ? ks08 1 who will john ask for information about summer courses ? ks08 1 which textbook did the teacher use in the class last summer ? ks08 1 whose car is blocking the entrance to the store ? ks08 1 why do you think he left ? ks08 1 who do you guess will be here ? ks08 1 who do you think borrowed my book ? ks08 1 which city does fred think that you believe that john lives in ? ks08 1 i wonder on which shelf john will put the book ? ks08 1 what proof that he has implicated have you found ? ks08 1 joseph has forgotten how many matches he has won . ks08 1 fred will warn martha that she should claim that her brother is patriotic . ks08 1 that bill tried to discover which drawer alice put the money in made us realize that we should have left him in seoul . ks08 1 jasper wonders which book he should attempt to persuade his students to buy . ks08 0 * i wonder if on which shelve john will put the book . ks08 0 * i wonder what city that romans destroyed . ks08 0 * john was wondering to whom he was referring to . ks08 0 * who do you think that has given the tickets to bill ? ks08 0 * what city will fred say that mary thinks that john lives ? ks08 0 * on whom does dana believe chris knows sandy trusts ? ks08 0 * the politician denied how the opponent was poisoned . ks08 0 * fred knows which book for the children to read during the summer vacation . ks08 1 this needs mending . 
ks08 0 * this needs mending the shoe . ks08 0 * he mended . ks08 1 he mended the shoe . ks08 1 this needs investigating . ks08 0 * this needs investigating the problem . ks08 0 * they investigated . ks08 1 they investigated the problem . ks08 1 the video which you recommended was really terrific . ks08 1 the video which i thought you recommended was really terrific . ks08 1 the video which i thought john told us you recommended was really terrific . ks08 1 the student who won the prize left . ks08 1 the student who everyone likes left . ks08 1 the person whom john gave the book to left . ks08 1 the day when i met her was sunny . ks08 1 the president who fred voted for has resigned . ks08 1 the president that fred voted for dislikes his opponents . ks08 1 the president fred voted for has resigned . ks08 1 has no relative pronoun at all . ks08 1 he is the kind of person with whom to consult . ks08 1 these are the things for which to be thankful . ks08 1 we will invite volunteers on whom to work . ks08 1 this is the student pictures of whom appeared in the newspaper . ks08 0 * pictures of whom appeared in the newspaper ? ks08 1 the people happy with the proposal left . ks08 1 the person standing on my foot is heavy . ks08 0 * the paper to finish by tomorrow is too long . ks08 0 * the person stand on my foot is heavy . ks08 0 * the person stood on my foot is heavy . ks08 0 * the student met the senator john met bill . ks08 0 * the student met the senator that john met bill . ks08 0 * the student met the senator for john to meet bill . ks08 1 jack is the person whom jenny fell in love with . ks08 1 jack is the person with whom jenny fell in love . ks08 0 * jack is the person whom jenny fell in love . ks08 1 i met the critic whose remarks i wanted to object to . ks08 1 this is the friend for whose mother kim gave a party . ks08 1 the teacher set us a problem the answer to which we can find in the textbook . ks08 1 we called the senators who met fred . 
ks08 1 the kid picked up the apple that fell down on the ground . ks08 0 * the student met john came . ks08 0 * the problem intrigued us bothered me . ks08 1 he made a statement which everyone thought was really interesting and important . ks08 1 they all agreed to include those matters which everyone believed had been excluded from the treaty . ks08 1 mary knows that john was elected . ks08 1 that john was elected surprised frank . ks08 1 mary told bill that john was elected . ks08 1 this is the book that we had read . ks08 1 the president abandoned the people that voted for him . ks08 1 it is an argument that people think will never end in egypt . ks08 0 * every essay she 's written and which i 've read is on that pile . ks08 0 * every essay she 's written and that i 've read is on that pile . ks08 1 every essay which she 's written and that i 've read is on that pile . ks08 1 every essay that she 's written and which i 've read is on that pile . ks08 1 the student whose turn it was left . ks08 0 * the student that 's turn it was left . ks08 1 the pencil with which he is writing broke . ks08 0 * the pencil with that he is writing broke . ks08 1 a pencil with which to write broke . ks08 0 * a pencil with that to write broke . ks08 0 * the people in who we placed our trust left . ks08 0 * the person with who we were talking left . ks08 1 the company in which they have invested left . ks08 1 the people in whose house we stayed left . ks08 1 the person with whom he felt most comfortable left . ks08 1 he bought a bench on which to sit . ks08 1 he bought a refrigerator in which to put the beer . ks08 1 there is a bench for you to sit on . ks08 0 * karen asked where for washington to put the chairs . ks08 1 the person i met is from boston . ks08 1 the box we put the books in is sealed . ks08 1 he made a statement everyone thought was interesting and important . ks08 1 they all agreed to include those matters everyone believed had been excluded from the treaty . 
ks08 1 i just know that the big 12 south teams everyone knew would win actually won the game . ks08 1 the person who john asked for help thinks he is foolish . ks08 1 mary , who john asked for help , thinks he is foolish . ks08 1 john has two sisters , who became lawyers . ks08 1 i met the lady from france who grows peaches . ks08 0 * i met john who grows peaches . ks08 0 * i met her who grows peaches . ks08 1 in the classroom , the teacher praised john , whom i also respect . ks08 1 reagan , whom the republicans nominated in 1980 , lived most of his life in california . ks08 1 every student who attended the party had a good time . ks08 0 * every student , who attended the party , had a good time . ks08 1 no student who scored 80 or more in the exam was ever failed . ks08 0 * no student , who scored 80 or more in the exam , was ever failed . ks08 1 the contestant who won the first prize , who is the judge 's brother-in-law , sang dreadfully . ks08 0 * the contestant , who is the judge 's brother-in-law , who won the first prize sang dreadfully . ks08 1 he who laughs last laughs best . ks08 1 he who is without sin among you , let him cast the first stone . ks08 1 who did he believe that he would one day meet ? ks08 1 which celebrity did he mention that he had run into ? ks08 0 * who did he believe the claim that he had never met ? ks08 0 * which celebrity did he mention the fact that he had run into ? ks08 1 the knife which he threw into the sea had a gold handle . ks08 1 the knife that he threw into the sea had a gold handle . ks08 1 the knife , which he threw into the sea had a gold handle . ks08 0 ?? the knife , that he threw into the sea had a gold handle . ks08 1 bill cooked supper and washed the dishes . ks08 0 * what did bill cook and wash the dishes ? ks08 0 * what did bill cook supper and wash ? ks08 1 he refuted the proof that you can not square it . ks08 0 * what did he refute the proof that you can not square ? 
ks08 1 they met someone who knows the professor . ks08 0 * which professor did they meet someone who knows ? ks08 1 that he has met the professor is extremely unlikely . ks08 0 * who is that he has met extremely unlikely ? ks08 1 she bought john 's book . ks08 1 did john wonder who would win the game ? ks08 0 * what did john wonder who would win ? ks08 1 what did he get the impression that the problem really was ? ks08 1 this is the paper that we really need to find the linguist who understands . ks08 0 * which rebel leader did you hear cheney 's rumor that the cia assassinated ? ks08 1 students enter high-level educational institutions might face many problems relating to study habits . ks08 1 a fellow student saw this felt sorry for miss kim and offered her his own book . ks08 1 experts all agree that dreams cause great anxiety and stress are called nightmares . ks08 1 the victims of the earthquake their property was destroyed in the disaster were given temporary housing by the government . ks08 1 this is the book which i need to read . ks08 1 the person whom they intended to speak with agreed to reimburse us . ks08 1 the motor that martha thinks that joe replaced costs thirty dollars . ks08 1 the official to whom smith loaned the money has been indicted . ks08 1 the man on whose lap the puppet is sitting is ventriloquist . ks08 1 we just finished the final exam the result of which we can find out next week . ks08 0 * what did herb start to play only after he drank ? ks08 0 * who did herb believe the claim that cheated ? ks08 0 * what was that the vikings ate a real surprise to you ? ks08 0 * what did you meet someone who understands ? ks08 1 the fact that scientists have now established all the genes in the human body is still not widely known . ks08 1 the fact that the scientists used the latest technology to verify was reported at the recent conference . ks08 1 they ignored the suggestion that lee made . ks08 1 they ignored the suggestion that lee lied . 
ks08 1 they denied the claim that we had advanced by ourselves . ks08 1 they denied the claim that they should report only to us . ks08 1 the hotel where gloria stays is being remodelled . ks08 1 the day when jim got fired was a sad day for everyone . ks08 1 john is tough to persuade . ks08 1 john made it clear that he would finish it on time . ks08 1 it is john that i met last night in the park . ks08 1 i wonder whom sandy loves . ks08 1 this is the politician on whom sandy relies . ks08 1 he is hard to love . ks08 1 it is easy to please john . ks08 1 john is easy to please . ks08 0 * to please john is eager . ks08 0 * it is eager to please john . ks08 1 john is eager to please . ks08 1 to please john is tough . ks08 1 it is tough to please john . ks08 1 john is tough to please . ks08 0 * to please john is ready . ks08 0 * it is ready to please john . ks08 1 john is ready to please . ks08 1 kim is easy to please . ks08 1 kim is eager to please . ks08 1 this doll is hard to see . ks08 1 the child is impossible to teach . ks08 1 the problem is easy to solve . ks08 0 * this doll is hard to see it . ks08 0 * the child is impossible to teach him . ks08 0 * the problem is easy to solve the question . ks08 1 john is eager to examine the patient . ks08 1 john is eager to find a new home . ks08 0 * john is eager to examine . ks08 0 * john is eager to find . ks08 1 hei is easy to please i . ks08 1 this theorem will take only five minutes to prove . ks08 1 this theorem will take only five minutes to establish that he proved in 1930 . ks08 1 this scratch will cost kim $ 500 to fix . ks08 1 this $ 500 bribe will cost the government $ 500,000 to prove that senator jones accepted . ks08 0 * kim is eager to recommend . ks08 1 who is kim eager to recommend ? ks08 1 this sonata is easy to play on this piano . ks08 1 which piano is this sonata easy to play on ? ks08 1 that dogs bark annoys people . ks08 1 it annoys people that dogs bark . ks08 1 why she told him is unclear . 
ks08 1 it is unclear why she told him . ks08 1 to leave so soon would be inconvenience . ks08 1 it would be inconvenience to leave so soon . ks08 1 it would be inconvenience for you to leave so soon . ks08 1 that the dalai lama claims tibet independence disturbs the chinese government . ks08 1 it disturbs the chinese government that the dalai lama claims tibet independence . ks08 1 i believe the problem to be obvious . ks08 0 * i believe that the problem is not easy to be obvious . ks08 1 i believe it to be obvious that the problem is not easy . ks08 1 i do not think it unreasonable to ask for the return of my subscription . ks08 1 he made it clear he would continue to co-operate with the united nations . ks08 1 they 're not finding it a stress being in the same office . ks08 1 that you came early surprised me . ks08 1 it surprised me that you came early . ks08 0 * surprised me that you came early . ks08 1 that chris knew the answer occurred to pat . ks08 1 it occurred to pat that chris knew the answer . ks08 1 it really freaks me out that we invaded iraq . ks08 1 that we invaded iraq really creeps me out . ks08 1 that we invaded iraq really freaks me out . ks08 1 it really bites that we invaded iraq . ks08 1 that fido barks annoys me . ks08 1 a man came into the room that no one knew . ks08 1 a man came into the room with blond hair . ks08 1 i read a book during the vacation which was written by chomsky . ks08 1 ray found the outcome frustrating . ks08 1 ray found it frustrating that his policies made little impact on poverty . ks08 0 * i made to settle the matter my objective . ks08 1 i made it my objective to settle the matter . ks08 1 i made the settlement of the matter my objective . ks08 0 * i owe that the jury acquitted me to you . ks08 1 i owe it to you that the jury acquitted me . ks08 1 i owe my acquittal to you . ks08 1 i believe strongly that the world is round . ks08 0 * i believe that the world is round strongly . 
ks08 1 it 's their teaching material that we 're using . ks08 1 what we 're using is their teaching material . ks08 1 their teaching material is what we are using . ks08 1 we are using their teaching material . ks08 1 i share your view but i just wonder why you think that 's good . ks08 1 it was the man that bought the articles from him . ks08 1 it was then that he felt a sharp pain . ks08 1 it was to the student that the teacher gave the best advice . ks08 1 it was not until i was perhaps twenty-five or thirty that i read and enjoyed them . ks08 0 * it was to finish the homework that john tried . ks08 0 * it is that bill is honest that john believes . ks08 1 it 's the second monday that we get back from easter holiday . ks08 1 it was the girl who kicked the ball . ks08 1 it 's mainly his attitude which convinced the teacher . ks08 1 what you want is a little greenhouse . ks08 1 what 's actually happening in london at the moment is immensely exciting . ks08 1 what is to come is in this document . ks08 1 what i 've always tended to do is to do my own stretches at home . ks08 1 what i meant was that you have done it really well . ks08 1 what happened is they caught her without a license . ks08 1 what the gentleman seemed to be asking is how policy would have differed . ks08 1 insensitive is how i would describe him . ks08 1 in the early morning is when i do my best research . ks08 0 * wear it like that is what you do . ks08 0 * they caught her without a license is what happened . ks08 0 * that you have done it really well is what i meant . ks08 1 that 's when i read . ks08 1 that was why she looked so nice . ks08 1 that 's how they do it . ks08 1 that 's who i played with over christmas . ks08 1 what you heard was an explosion . ks08 1 it was an explosion that you heard . ks08 1 what you should do is order one first . ks08 0 * it is order one first that you should do first . ks08 0 * order one first is what you should do . 
ks08 1 it was not until i was perhaps twenty-five or thirty that i read them and enjoyed them . ks08 0 * when i read them and enjoyed them was not until i was perhaps twenty-five . ks08 0 * not until i was perhaps twenty-five was when i read them and enjoyed them . ks08 1 it 's the writer that gets you so involved . ks08 0 * that gets you so involved is the writer . ks08 0 * the writer is that gets you so involved . ks08 1 and it was this matter on which i consulted with the chairman of the select committee . ks08 0 * on which i consulted with the chairman of the select committee was this matter . ks08 0 * this matter was on which i consulted with the chairman of the select committee . ks08 1 what i ate is an apple . ks08 1 what we are using is their teaching material . ks08 1 the student who got a in the class was very happy . ks08 1 the one who broke the window was mr. kim . ks08 1 he got what he wanted . ks08 1 he put the money where lee told him to put it . ks08 1 the concert started when the bell rang . ks08 0 * lee wants to meet who kim hired . ks08 0 * lee solved the puzzle how kim solved it . ks08 0 * which book he read the book was that one . ks08 1 i ate what john ate . ks08 1 i ate an apple . ks08 0 * to whom i gave the cake is john . ks08 0 * that brought the letter is bill . ks08 1 this is how he did it . ks08 1 this is why he came early . ks08 1 type a : it is on bill that john relies . ks08 1 type b : it is bill on whom john relies . ks08 1 it was then when we all went to bed . ks08 0 * john that we are looking for showed up . ks08 1 it 's the second monday that we get back from easter . ks08 1 it was in 1997 when the in introduced the alien registration receipt card . ks08 1 it is uncle john whose address i lost . ks08 0 * it is kim on whom that sandy relies . ks08 0 * it is kim on whom sandy relies on . ks08 0 * it is kim whom sandy relies . ks08 1 it was the director that she wants to meet . 
ks08 1 it was the director that she said she wants to meet . ks08 1 it was the director that i think she said she wants to meet . ks08 1 i wonder who it was who saw you . ks08 1 i wonder who it was you saw . ks08 1 i wonder in which pocket it was that kim had hidden the jewels . ks08 1 who do you think it is that mary met ? ks08 0 * to whom do you think it is the book that mary gave ? ks08 1 it is difficult for me to concentrate on calculus . ks08 1 for me to concentrate on calculus is difficult . ks08 1 calculus is difficult for me to concentrate on . ks08 1 being lovely to look at has its advantages . ks08 1 letters to grandma are easy to help the children to write . ks08 1 it was to boston that they decided to take the patient . ks08 1 it was with a great deal of regret that i vetoed your proposal . ks08 1 it was tom who spilled beer on this couch . ks08 1 it is martha whose work critics will praise . ks08 1 it was john on whom the sheriff placed the blame . ks08 1 i wondered who it was you saw . ks08 1 i was wondering in which pocket it was that kim had hidden the jewels . ks08 0 * it is on kim on whom sandy relies . ks08 1 was it for this that we suffered and toiled ? ks08 1 who was it who interviewed you ? ks08 1 i believe it to be her father who was primarily responsible . ks08 1 i believe it to be the switch that is defective . ks08 1 tom ate what mary offered to him . ks08 1 i wonder what mary offered to him . ks08 1 what mary offered to him is unclear . kl93 1 i do n't have any potatoes . kl93 0 * i have any potatoes . kl93 1 at most three girls saw anything . kl93 0 * at least three girls saw anything . kl93 1 every girl who saw anything was happy . kl93 0 * some girl who saw anything was happy . kl93 1 any owl hunts mice . kl93 1 any lawyer could tell you that . kl93 1 i would dance with anybody . kl93 1 almost every lawyer could answer that question . kl93 1 almost no lawyer could answer that question . 
kl93 1 almost any lawyer could answer that question . kl93 0 * i do n't have almost any potatoes . kl93 1 i would dance with mary or sue . kl93 1 mary or sue could tell you that . kl93 1 do you have dry socks ? kl93 1 perhaps some dry socks would help ? kl93 1 an owl hunts mice . kl93 1 generics allow exceptions . kl93 1 a poodle gives live birth . kl93 1 every poodle gives live birth . kl93 1 i do n't have potatoes . kl93 1 every man who has matches is happy . kl93 1 every man who has any matches is happy . kl93 1 could we make some french fries ? kl93 1 why do n't we make some french fries ? kl93 1 are you prepared for school tomorrow ? kl93 1 and then all the owls go on a mice hunt . kl93 1 if you take a dry match and strike it , it lights . kl93 1 at most three teachers assigned homework . kl93 1 at most three teachers assigned any homework . kl93 1 every student who handed in some homework will get a prize . kl93 1 every student who handed in any homework will get a prize . kl93 1 before you make plans , consult the secretary . kl93 1 before you make any plans , consult the secretary . kl93 1 is there anything i can do for you ? kl93 1 a professional dancer would be able to do it . kl93 1 any professional dancer would be able to do it . kl93 1 we do n't have potatoes , or at least not enough . kl93 1 every man who has any matches is happy . kl93 0 * every boy has any potatoes . kl93 0 * it 's not the case that every boy has any potatoes . kl93 1 i 'm surprised we had any potatoes . kl93 1 at most three boys did n't see anything . kl93 0 * even sue said anything . kl93 1 sue was the most likely not to say anything . kl93 1 sue said something although she was the most likely not to say anything . kl93 1 cows fly more often than john visits any relatives . kl93 0 * each candidate who has any interest in semantics will be admitted to the department . kl93 1 every child should have a daily glass of milk . 
kl93 1 each child should have a daily glass of milk . kl93 1 i 'm surprised that he ever said anything . kl93 1 i 'm sorry that he ever said anything . kl93 0 * i 'm glad that i ever met him . kl93 0 * i 'm sure that i ever met him . kl93 1 i 'm surprised he bought a car . kl93 1 but these tickets are terrible ! kl93 1 i was surprised that he stole the watch , in as far as that was a daring thing to do . kl93 1 given my high opinion on his moral character , i was surprised that he stole the watch . kl93 1 were you surprised that he stole the watch ? kl93 1 i 'm sorry that anybody hates me . kl93 1 i want for nobody to hate me . kl93 1 i 'm glad he bought a car . kl93 1 i 'm sorry he bought a car . kl93 1 he bought a honda . kl93 0 * i 'm glad i saw anybody . kl93 1 i 'm glad anybody likes me ! kl93 1 could n't you get any tickets better than this ? kl93 1 it 's fine that he paid and apologized , but i do n't really care about his gratitude , or the money , or anything . kl93 0 * i 'm sure we got any tickets ! kl93 1 i 'm sure he speaks to me ! kl93 1 i 'm glad a linguist likes me . kl93 1 i did n't help him because i have any sympathy for urban guerillas . kl93 0 * it is n't because sue said anything bad about me that i 'm angry , although she did say some bad things about me . kl93 1 i do n't have any sympathy for urban guerillas . kl93 0 * almost an owl hunts mice . kl93 0 * absolutely an owl hunts mice . kl93 1 almost any owl hunts mice . kl93 1 absolutely any owl hunts mice . b_82 1 he began writing poems . b_82 1 he kept writing poems . b_82 1 he continued writing poems . b_82 1 he stopped writing poems . b_82 1 the men would have all been working . b_82 1 the men would have been all working . b_82 1 would the men each have been working ? b_82 0 * would each the men have been working ? b_82 1 the men would not enjoy that . b_82 0 * would not the men enjoy that ? b_82 1 would the men not enjoy that ? b_82 0 * the men would all not have been working . 
b_82 1 the men all would not have been working . b_82 1 the men would not have all been working . b_82 1 the men would not all have been working . b_82 1 the men would not have been all working . b_82 1 that john is a fool is obvious . b_82 1 it is obvious that john is a fool . b_82 0 * john believes that fred likes steak that joe likes pizza . b_82 1 john whined that he was hungry . b_82 0 * that he was hungry was whined by john . b_82 1 john is certain that the mets will win . b_82 1 that he has blood on his hands proves that john is the murderer . b_82 0 * it proves that john is the murderer that he has blood on his hands . b_82 1 to please john would be difficult . b_82 1 it would be difficult to please john . b_82 1 it is believed to be obvious by everyone that fred is crazy . b_82 0 * john is believed to be certain by everyone that fred is crazy . b_82 1 it disturbed him that people did n't like fred . b_82 1 it was believed to have disturbed him that people did n't like fred . b_82 0 * how easy to please john is it ? b_82 0 * how difficult to study for the exam was it ? b_82 0 * how hard to read the book was it ? b_82 0 * how easy to tease john it is ! b_82 0 * how hard to read the book it was ! b_82 1 how certain that the mets will win are you ? b_82 1 how likely to win is he ? b_82 1 this book i enjoyed . b_82 0 * to whom the book did you give . b_82 0 * the book to whom did you give . b_82 1 he 's a man to whom liberty we could never grant . b_82 1 it 's obvious that mary , he ca n't stand . b_82 1 i think that the trolls will take the shepherd tomorrow . b_82 1 as for max , i really like him . b_82 0 * he 's a man to whom as for liberty , we could never grant it . b_82 0 * he 's a man to whom liberty , we could never grant it . b_82 1 john would like that because he 's such a nice guy . b_82 1 john , because he 's such a nice guy , would like that . b_82 1 because he 's such a nice guy , john would like that . 
b_82 1 john would , because he 's such a nice guy , like that . b_82 1 because he 's such a nice guy , what would john like ? b_82 1 it 's obvious that , although he 's a nice guy , john is n't too bright . b_82 0 * john ate after getting home the steak . b_82 1 i gave mary a book . b_82 1 i considered fred crazy . b_82 1 i put the book on the table . b_82 1 i worded the telegram tersely . b_82 0 * i considered fred after the party crazy . b_82 0 * 1 put the book after the party on the table . b_82 0 * i worded the telegram after the party tersely . b_82 0 * because she 's so pleasant , mary i really like her . b_82 1 because she 's so pleasant , mary i really like . b_82 1 though he may seem intelligent , he does not seem deep . b_82 1 intelligent though he may seem , he does not seem deep . b_82 1 though i may love her , that wo n't affect the grade . b_82 1 love her though i may , that wo n't affect the grade . b_82 0 * handsome though i believe the claim that tom is , i still wo n't date him . b_82 0 * handsome though they told me that tom is , i still wo n't date him . b_82 0 * handsome though my friends suggested that mary thinks that tom is , i still wo n't date him . b_82 1 hate those who criticize carter though he may , it does n't matter . b_82 1 would john hate that ? b_82 1 would john hate that ! b_82 0 * will , after john comes home , sally take a shower ? b_82 1 will sally , after john comes home , take a shower ? b_82 1 after john comes home , will sally take a shower ? b_82 1 i would prefer that he not have finished . b_82 0 * i would prefer that he have not finished . b_82 1 he has not finished . b_82 1 he is not finishing . b_82 1 he would not finish . b_82 1 he does not finish . b_82 0 * those people will , after the party , not come home . b_82 1 those people , after the party , will not come home . b_73 1 i 've never seen a man taller than my father . b_73 1 i 've never seen a taller man than my father . 
b_73 1 i 've never seen a man taller than my mother . b_73 1 i 've never seen a taller man than my mother . b_73 1 jack eats caviar more than he eats mush . b_73 1 jack eats more caviar than he eats mush . b_73 1 jack eats caviar more than he sleeps . b_73 0 * jack eats more caviar than he sleeps . b_73 1 i am more angry today than i was yesterday . b_73 1 i am more angry than sad . b_73 0 * i am angrier than sad . b_73 1 mary is more than six feet tall . b_73 1 mary is taller than six feet . b_73 0 * mary is more than five feet short . b_73 1 mary is shorter than five feet . b_73 1 they think she has too much independence . b_73 0 * they think she is too much happy . b_73 0 * mary speaks so much gently . b_73 0 * a tangerine is n't as much different from an orange as i 'd thought . b_73 1 a tangerine is n't as different from an orange as i 'd thought . b_73 1 you and i are as much alike as a horse and a cow . b_73 1 you and i are as alike as a horse and a cow . b_73 1 you and i are as little alike as a horse and a cow . b_73 0 * john is as much intelligent as mary . b_73 1 john is as intelligent as mary . b_73 1 john is more than 6 feet tall . b_73 1 john is taller than 6 feet . b_73 1 these plants may grow as much as 6 feet high . b_73 1 these plants may grow as high as 6 feet . b_73 1 more has happened in the last week than will happen in the next year . b_73 1 he offers more than we had hoped for . b_73 1 he was hoping for more than we offered . b_73 1 enough is going on to keep them confused . b_73 1 you 've said enough to convince me . b_73 1 sally eats caviar more than i had expected . b_73 1 susan does n't eat her vegetables enough . b_73 1 sally eats the stuff pretty often . b_73 0 * sally eats pretty often the stuff . b_73 1 sally eats the stuff more . b_73 0 * sally eats more the stuff . b_73 0 * susan does n't eat enough her vegetables . b_73 1 john eats more . b_73 1 john does n't eat enough . b_73 1 john eats more than he sleeps . 
b_73 1 he gave me more of his marbles than i wanted . b_73 0 * sally enough eats caviar . b_73 0 * enough sally eats caviar . b_73 1 jack is more tall than thin . b_73 1 i did it more in jest than in anger . b_73 1 there is enough of the bread left to have tomorrow . b_73 1 there is enough bread for all of you . b_73 1 there is bread enough for all of you . b_73 1 she has enough of a problem as it is . b_73 0 * she has enough a problem as it is . b_73 0 * she has enough problem as it is . b_73 0 * she has problem enough as it is . b_73 0 * she has enough of problems as it is . b_73 1 she has enough problems as it is . b_73 1 he looks more formidable than he is . b_73 0 * he seems enough intelligent for you . b_73 1 he seems intelligent enough for you . b_73 1 she writes more clearly than she speaks . b_73 0 * she speaks enough clearly to be understood . b_73 1 he 's enough of a fool to try it . b_73 1 he 's fool enough to try it . b_73 1 i saw more of the man than you did . b_73 1 i saw enough of the fool to be convinced . b_73 1 harry got to be more of a celebrity . b_73 1 harry got to be more of the celebrity . b_73 0 * he 's enough of the coward to pull the trigger . b_73 1 what his father wants him to be is more of a man . b_73 0 * more of a man is here . b_73 0 * i 've kicked more of a man than you have . b_73 0 * i 've known more of a man than frank . b_73 1 i 've never known more of a man than frank . b_73 1 he was hoping for too much . b_73 1 sally eats caviar too much for her own good . b_73 1 john eats so much . b_73 1 he gave me many marbles . b_73 1 i have much typing to do . b_73 0 * he looks so much formidable . b_73 1 he looks so formidable . b_73 0 * she speaks too much clearly . b_73 1 she speaks too clearly . b_73 1 i 'm as much of a man as you are , my dear . b_73 1 harry got to be as much of a celebrity as his father . b_73 0 * harry got to be as much of the celebrity as his father . b_73 0 * as much of a man is here . 
b_73 0 * i 've seen as much of a coward as frank . b_73 1 many are called ; few are chosen . b_73 1 more are called than are ever chosen . b_73 1 we made enough pudding to last for days . b_73 0 * we ate enough a pudding to satisfy us . b_73 1 we made enough puddings to last for days . b_73 0 * we ate enough the puddings to satisfy us . b_73 1 john is the kind of a fool that i told you about . b_73 1 john is the kind of the fool that i told you about . b_73 1 he 's a bit of a gossip . b_73 0 * he 's the bit of a gossip . b_73 1 he 's something of a gossip . b_73 1 john is the kind of fool that i told you about . b_73 0 * he 's fool . b_73 0 * he 's a fool enough to try it . b_73 0 * she 's just enough tall . b_73 0 * she 's enough tall . b_73 0 * she speaks enough clearly . b_73 1 he 's that reliable a man . b_73 0 * he 's a that reliable man . b_73 1 he 's too reliable a man . b_73 0 * he 's a too reliable man . b_73 1 he 's as reliable a man . b_73 0 * he 's an as reliable man . b_73 1 he 's so reliable a man . b_73 0 * he 's a so reliable man . b_73 0 * he 's more reliable a man . b_73 0 * he 's reliable enough a man . b_73 1 he 's a reliable enough man . b_73 1 tom was not more reliable than a grasshopper . b_73 1 tom was no more reliable than a grasshopper . b_73 0 * not more reliable a man could be found . b_73 0 * any more reliable a man could not be found . b_73 1 i do n't want trouble . b_73 0 * john is not more reliable a fellow than bill . b_73 1 john is not a more reliable fellow than bill . b_73 1 john is n't any more reliable a fellow than bill . b_73 0 * john is n't an any more reliable fellow than bill . b_73 1 john is no more reliable a fellow than bill . b_73 0 * john is a no more reliable fellow than bill . b_73 1 i have as many too many marbles as you . b_73 1 i have as many marbles too many as you . b_73 1 i have six too many marbles . b_73 1 i have six marbles too many . b_73 1 i have six more of them . b_73 0 * i have six of them more . 
b_73 1 i have half a dozen too many of these marbles . b_73 0 * i have half a dozen of these marbles too many . b_73 1 she writes clearly enough . b_73 1 she is as brilliant a woman as her mother . b_73 0 * she is as brilliant the woman as her mother . b_73 1 i 've never known as strong a person as louise . b_73 1 fido is a smarter dog than spot . b_73 0 * fido is the smarter dog than spot . b_73 1 what his father wants him to be is a better pool player . b_73 1 a taller man than bill is here . b_73 1 i 've never known a smarter dog than fido . b_73 1 he 's so tall a man that doors are dangerous to him . b_73 1 he 's such a tall man that doors are dangerous to him . b_73 1 he 's such a tall man . b_73 1 he 's such the tall man . b_73 1 what her mother wants her to be is such a fine surgeon that everyone will respect her . b_73 1 it was as awful a picture as it first seemed . b_73 0 * it was so awful a picture as it first seemed . b_73 1 it was n't as awful a picture as it first seemed . b_73 1 it was n't such an awful picture as it first seemed . b_73 1 it was so awful a picture that i tore it up . b_73 1 it was such an awful picture that i tore it up . b_73 1 mary is such a wit that people are afraid of her . b_73 1 sally is n't such a fool as people think . b_73 0 * sally is such a fool as people think . b_73 1 i love her so much . b_73 1 i gave her so much . b_73 0 * i gave her so . b_73 1 hilda is such a scholar . b_73 1 hilda is such a scholar that all her work is impeccable . b_73 1 hilda is such a scholar as you were speaking of just now . b_73 1 so eminent a scholar as dr. lucille hein was here . b_73 1 such an eminent scholar as dr. lucille hein was here . b_73 1 so elegant a solution as you have presented us with can elicit only admiration . b_73 1 you have presented so elegant a solution that we can only admire it . b_73 1 such a scholar as you were speaking of just now is here . b_73 0 * so much of a scholar is here . 
b_73 1 her mother wants mary to be such an eminent woman that everyone will respect her . b_73 1 john a decidedly taller man than bill . b_73 0 * john is a decidedly too tall man . b_73 1 that 's an obviously better solution . b_73 0 * that 's an obviously so good solution . b_73 1 she made so much better a reply . b_73 0 * she made such a much better reply . b_73 0 * she made such a better reply . b_73 0 * that 's the most kind answer that i ever heard . b_73 0 * that 's a most kind answer that i ever heard . b_73 0 * that 's a kindest answer that i ever heard . b_73 1 that 's the kindest answer that i ever heard . b_73 1 most helpful advice is unwanted . b_73 1 sally will give me more helpful advice than the advice i got from you . b_73 1 i 've never seen a man who is taller than my mother . b_73 1 i 've never seen the one man taller than my father . b_73 0 * i 've never seen the taller man than my father . b_73 0 * i 've never seen the one taller man than my father . b_73 1 john wants to come up with as good a solution as christine did . b_73 1 john wants to come up with a solution as good as christine . b_73 1 john wants to find a solution better than christine 's . b_73 1 caviar is eaten by jack more than mush . b_73 1 more caviar than mush is eaten by jack . b_73 1 jack ate more of this than he ate of that . b_73 1 the table is longer than the door is wide . b_73 1 mary 's happy about her work , and john 's happy about his children . b_73 0 * mary 's happy about her work , and john 's about his children . b_73 1 mary 's happy about her work , and john is about his children . b_73 1 mary is happy with her work , and john is with his children . b_73 1 mary 's happy with her work , and john 's with his children . b_73 0 * the table is longer than the door 's wide . b_73 1 the table is long , and the door 's wide . b_73 0 * i was happier there than i 'm here . b_73 1 i 'm sad , more than i 'm angry . b_73 0 * i 'm sadder than i 'm angry . 
b_73 1 i 'm more sad than angry . b_73 1 i 'm worrying , more than thinking . b_73 0 * i 'm more worrying than thinking . b_73 0 * i 'm sadder than angry . b_73 1 i 'm sad , as much as i 'm angry . b_73 1 i 'm as much sad as angry . b_73 0 * i 'm as sad as angry . b_73 1 i am angrier today that i was yesterday . b_73 1 john is taller than six feet . b_73 1 john is taller than bill . b_73 1 mary has more than two friends . b_73 0 * mary has more than just bill and pete friends . b_73 1 mary has more friends than two . b_73 1 mary has more friends than just bill and pete . b_73 1 they may grow as much as six feet high . b_73 0 * they may grow as much as bamboo high . b_73 1 they may grow as high as six feet . b_73 1 some of them made as many as 20 errors . b_73 0 * some of them made as many as joan errors . b_73 1 some of them made as many errors as joan . b_73 0 * john is taller than six feet is . b_73 1 john is taller than pete is . b_73 0 * mary has more friends that two . b_73 0 * mary has more friends than just bill and pete are . b_73 0 * john is more than five feet short . b_73 1 john is shorter than five feet . b_73 1 mary has more enemies than bill has friends . b_73 0 * mary has more than bill has friends enemies . b_73 1 mary does n't have as many too many too many as jane . b_73 1 jane has more nearly as many too many than mary . b_73 1 mary swam five more laps than joan swam . b_73 1 mary swam as many more laps than joan as linda . c_13 1 bill kissed himself . c_13 0 * bill kissed herself . c_13 1 sally kissed herself . c_13 0 * kiss himself . c_13 1 the robot kissed itself . c_13 1 she knocked herself on the head with a zucchini . c_13 0 * she knocked himself on the head with a zucchini . c_13 1 the snake flattened itself against the rock . c_13 1 the joneses think themselves the best family on the block . c_13 0 * the joneses think himself the most wealthy guy on the block . c_13 1 gary and kevin ran themselves into exhaustion . 
c_13 0 * gary and kevin ran himself into exhaustion . c_13 1 people from tucson think very highly of themselves . c_13 1 i gave myself the bucket of ice cream . c_13 0 * she hit myself with a hammer . c_13 1 she hit herself with a hammer . c_13 1 doug blew the building up . c_13 1 doug blew up the building . c_13 1 doug blew it up . c_13 1 doug blew up it . c_13 0 * who do you wonder what bought ? c_13 1 i wonder what fiona bought . c_13 0 * toothbrush the is blue . c_13 1 cheese mice love stinks . c_13 1 the dancing chorus line of elephants broke my television set . c_13 1 rosie loves magazine ads . c_13 1 i think rosie loves magazine ads . c_13 1 dana doubts that drew believes i think rosie loves magazine ads . c_13 1 dave left . c_13 1 dave and alina left . c_13 1 dave , dan , erin , and alina left . c_13 1 who do you think that ciaran will question first ? c_13 1 who do you think ciaran will question first ? c_13 1 who do you think will question seamus first ? c_13 0 * who do you think that will question seamus first ? c_13 1 i expect soon to see the results . c_13 1 i expect to see the results soon . c_13 1 i expect to soon see the results . c_13 0 * i expect more than to double my profits . c_13 0 * i expect to double more than my profits . c_13 0 * i expect to double my profits more than . c_13 1 i expect to more than double my profits . c_13 1 who did you see in las vegas ? c_13 1 you are taller than me . c_13 0 * my red is refrigerator . c_13 0 * who do you think that saw bill ? c_13 1 my friends wanted to quickly leave the party . c_13 0 * bunnies carrots eat . c_13 1 george sang to himself . c_13 0 * himself sang to george . c_13 1 betsy loves herself in blue leather . c_13 1 everyone should be able to defend himself . c_13 1 everyone should be able to defend herself . c_13 1 i hope nobody will hurt themselves . c_13 1 i hope nobody will hurt himself . c_13 1 do n't hit yourself ! c_13 1 she is dancing . c_13 1 they are dancing . 
c_13 1 the man is dancing . c_13 1 the men are dancing . c_13 1 the students met to discuss the project . c_13 1 zeke cooked and ate the chili . c_13 1 zeke ate and cooked the chili . c_13 1 he put the clothes . c_13 1 he put in the washing machine . c_13 1 i gave my brother a birthday present . c_13 1 i gave a birthday present to my brother . c_13 1 where do you guys live at ? c_13 1 it is obvious to everybody that tasha likes misha . c_13 1 the man loved peanut butter cookies . c_13 1 the puppy loved peanut butter cookies . c_13 1 the king loved peanut butter cookies . c_13 0 * the green loved peanut butter cookies . c_13 0 * the in loved peanut butter cookies . c_13 0 * the sing loved peanut butter cookies . c_13 1 john went to the store . c_13 1 the man went to the store . c_13 0 * quickly walks went to the store . c_13 0 * to the washroom kissed the blarney stone . c_13 1 the destruction of the city bothered the mongols . c_13 1 sincerity is an important quality . c_13 1 the assassination of the president . c_13 1 tucson is a great place to live . c_13 1 gabrielle 's mother is an axe murderer . c_13 1 hamsters mother attractive offspring . c_13 1 wendy 's mother country is iceland . c_13 1 louis said that parts of speech intrigued her . c_13 0 * cat ate the spider . c_13 1 the cat ate the spider . c_13 1 cats ate the spider . c_13 1 the cats ate the spider . c_13 0 * i ate apple . c_13 1 i ate the apple . c_13 1 i ate sugar . c_13 1 i ate the sugar . c_13 1 he is filled with sincerity . c_13 1 i doubt his sincerity . c_13 1 the dastardly surgeon stole the physician 's lunch . c_13 1 i asked the question . c_13 1 i asked if you knew the answer . c_13 1 i hit the ball . c_13 1 i spared him the trouble . c_13 0 * i put the box the book . c_13 1 i put the book in the box . c_13 1 i gave the box to leah . c_13 1 i gave leah the box . c_13 1 i told daniel the story . c_13 1 i told daniel that the exam was cancelled . c_13 1 i told the story to daniel . 
c_13 1 the canadian government uses a parliamentary system of democracy . c_13 1 the canadian bought himself a barbecue . c_13 1 the prudish linguist did n't enjoy looking at the internet . c_13 1 we keep those censored copies of the book hidden to protect the sensibilities of the prudish . c_13 1 susan bought some flowers for her mother . c_13 1 susan bought some flowers for her birthday . c_13 0 * susan bought her birthday some flowers . c_13 1 i gave blood . c_13 1 i do n't give a darn . c_13 1 andy gives freely of his time . c_13 1 dan gave his life . c_13 1 dan gives to charity . c_13 1 sorry , i gave last week . c_13 1 the student loved his phonology readings . c_13 1 i saw these dancers and those musicians smoking something . c_13 1 i am drinking lemonade and eating a brownie . c_13 1 we went through the woods and over the bridge . c_13 1 the man whose car i hit last week sued me . c_13 1 the big man from ny has often said that he gave peanuts to elephants . c_13 1 the man killed the king with the knife . c_13 1 we ate at a really fancy restaurant . c_13 0 * we ate at . c_13 1 big bowls of beans are what i like . c_13 1 the big boy was kissed by the drooling dog . c_13 1 the drooling dog kissed the big boy . c_13 1 john and the man went to the store . c_13 0 * john and very blue went to the store . c_13 1 bruce loved and kelly hated phonology class . c_13 0 * the with milk coffee is hot . c_13 1 the kangaroo hopped over the truck . c_13 1 i have n't seen this sentence before . c_13 1 susan will never sing at weddings . c_13 1 the officer carefully inspected the license . c_13 1 every cat always knows the location of her favorite catnip toy . c_13 1 the cat put her catnip toy on the plastic mat . c_13 1 the very young child walked from school to the store . c_13 1 john paid a dollar for a head of lettuce . c_13 1 teenagers drive rather quickly . c_13 1 a clever magician with the right equipment can fool the audience easily . 
c_13 1 the police might plant the drugs in the apartment . c_13 1 those olympic hopefuls should practice diligently daily . c_13 1 the latest research on dieting always warns people about the dangers of too much cholesterol . c_13 1 that annoying faucet was dripping constantly for months . c_13 1 marian wonders if the package from boston will ever arrive . c_13 1 i said that bonny should do some dances from the middle east . c_13 1 that dan smokes in the office really bothers alina . c_13 1 the belief that syntactic theory reveals the inner structure of sentences emboldened the already much too cocky professor . c_13 1 i bought the parrot in the store . c_13 1 i put the milk in the fridge . c_13 1 i mailed the sweater to mary . c_13 1 i knew the man with the brown hair . c_13 1 john said mary went to the store quickly . c_13 1 i discovered an old english poem . c_13 1 susanne gave the minivan to george . c_13 1 clyde got a passionate love letter from stacy . c_13 1 he blew out the candle . c_13 1 he turned off the light . c_13 1 he blew up the building . c_13 1 he rode out the storm . c_13 0 * shannon kissed quietly the kitten . c_13 1 shannon left quietly every day . c_13 1 juliet says that romeo lies to his parents a lot . c_13 1 the puppy licked the kitten 's face . c_13 1 it is raining . c_13 1 fred feels fine . c_13 1 that bill 's breath smells of onions bothers erin . c_13 1 susan kissed the clown 's nose . c_13 1 cedric danced a jolly jig . c_13 1 dale said that the lawn was overgrown . c_13 1 gilgamesh cut the steak with a knife . c_13 1 we drove all the way to buenos aires . c_13 1 john tagged lewis with a regulation baseball on tuesday . c_13 1 the big man from new york loves bagels with cream cheese . c_13 1 susan rode a bright blue train from new york . c_13 1 the plucky platypus kicked a can of soup from new york to tucson . c_13 1 john said martha sang the aria with gusto . c_13 1 martha said john sang the aria from la bohème . 
c_13 1 the book of poems with the bright red cover stinks . c_13 1 louis hinted mary stole the purse deftly . c_13 1 the extremely tired students hated syntactic trees with a passion . c_13 1 many soldiers have claimed bottled water satisfies thirst best . c_13 1 networking helps you grow your business . c_13 1 i did n't read a single book the whole time i was in the library . c_13 1 i did not have a red cent . c_13 1 felicia wrote a fine paper on zapotec . c_13 1 heidi hit herself on the head with a zucchini . c_13 0 * herself hit heidi on the head with a zucchini . c_13 1 heidi believes any description of herself . c_13 1 john knew that there would be a picture of himself hanging in the post . c_13 1 although he loves marshmallows , john is not a big fan of chocolate . c_13 1 his yearbook picture gives tom the creeps . c_13 1 marilyn monroe is norma jeane baker . c_13 1 gene simmons was originally named haim goldberg . c_13 1 kevin ate spaghetti with a spoon and geordie did so too . c_13 1 the chef eats beans and serves salads with forks . c_13 1 i am frightened of tigers . c_13 1 i am afraid of tigers . c_13 1 i am fond of circus performers . c_13 1 i fear tigers . c_13 1 i like circus performers . c_13 1 i am afraid of tigers and fond of clowns without exception . c_13 1 i am frightened of tigers and fond of clowns without exception . c_13 1 bob is very serious about mary , but less so than paul . c_13 1 the book of poems with a red cover from blackwell by robert burns takes a very long time to read . c_13 1 the book of poems with a red cover from blackwell by robert burns takes a very long time to read . c_13 1 the book of poems from blackwell with a red cover by robert burns takes a very long time to read . c_13 1 the book of poems from blackwell by robert burns with a red cover takes a very long time to read . c_13 1 the book of poems by robert burns from blackwell with a red cover takes a very long time to read . 
c_13 1 the book of poems by robert burns with a red cover from blackwell takes a very long time to read . c_13 1 the book of poems with a red cover by robert burns from blackwell takes a very long time to read . c_13 0 * the book with a red cover of poems from blackwell by robert burns takes a very long time to read . c_13 0 * the book with a red cover from blackwell of poems by robert burns takes a very long time to read . c_13 0 * the book with a red cover from blackwell by robert burns of poems takes a very long time to read . c_13 1 the book of poems with a red cover and with a blue spine takes a very long time to read . c_13 1 the book of poems and of fiction from blackwell takes a very long time to read . c_13 0 * the one of poems with a red cover takes a very long time to read . c_13 1 i loved the policeman intensely with all my heart . c_13 0 * i loved intensely the policeman with all my heart . c_13 0 * i loved the policeman the baker intensely with all my heart . c_13 1 mika loved the policeman intensely and susan did so half heartedly . c_13 0 * susan did so the baker . c_13 1 john fears dogs . c_13 1 john is afraid of dogs . c_13 1 two or three books take a very long time to read . c_13 0 * two or boring books take a very long time to read . c_13 1 the red dress with the pink stripes looks good on sandy . c_13 1 the ugly man from brazil found books of poems in the puddle . c_13 1 erin never keeps her pencils in the correct drawer . c_13 1 dan walked to new mexico in the rain last year . c_13 1 george wrote a volume of poems in latin for jane . c_13 1 people with boxes of old clothes lined up behind the door of the building with the leaky roof . c_13 1 that automobile factories abound in michigan worries me greatly . c_13 1 no one understands that phrase structure rules explain the little understood phenomenon of the infinite length of sentences . c_13 1 my favorite language is a language with simple morphology and complicated syntax . 
c_13 1 ivan got a headache on wednesday from the disgruntled students of phonology from michigan . c_13 1 the collection of syntax articles with the red cover bores students of syntax in tucson . c_13 1 the red volume of obscene verse from italy shocked the puritan soul of the minister with the beard quite thoroughly yesterday . c_13 1 the biggest man in the room said that john danced an irish jig from county kerry to county tipperary on thursday . c_13 1 a burlap sack of potatoes with mealy skins fell on the professor of linguistics with the terrible taste in t-shirts from the twelfth story . c_13 1 the bright green filing cabinet was filled to the brim with the most boring articles from a prestigious journal of linguistics with a moderately large readership . c_13 1 the coat of the panther is dark black . c_13 1 the roof of the building is leaking . c_13 1 the hat of the man standing over there impressed me greatly . c_13 1 the panther 's coat is dark black . c_13 1 the building 's roof is leaking . c_13 1 the man standing over there 's hat impressed me greatly . c_13 0 * the man 's standing over there hat impressed me greatly . c_13 0 * the man standing over there 's the hat impressed me greatly . c_13 1 the boy ran . c_13 1 howard is a linguistics student . c_13 1 peter said that danny danced . c_13 1 bill wants susan to leave . c_13 1 peter thinks that cathy loves him . c_13 1 people selling their stocks caused the crash of 1929 . c_13 1 for mary to love that boor is a travesty . c_13 1 i said that mary signed my yearbook . c_13 1 i want mary to sign my yearbook . c_13 1 i 've never seen you eat asparagus . c_13 1 i know you ate asparagus . c_13 0 * i 've never seen you ate asparagus . c_13 0 * i 've never seen him eats asparagus . c_13 1 i 've never seen him eat asparagus . c_13 1 i think that he eats asparagus . c_13 1 i want to eat asparagus . c_13 1 i want him to eat asparagus . c_13 1 i wonder if he eats asparagus . 
c_13 1 for him to eat asparagus is a travesty . c_13 1 i asked for him to eat the asparagus . c_13 1 i think he will eat asparagus . c_13 1 fabio asked if claus had run a marathon . c_13 0 * fabio asked if had claus run a marathon . c_13 0 * fabio asked had if claus run a marathon . c_13 1 you can lead a horse to water but will it drink ? c_13 1 he will go . c_13 1 he goes . c_13 1 the peanut butter has got moldy . c_13 1 the swing blasted the golf ball across the green . c_13 1 that harry loves dancing is evidenced by his shiny tap shoes . c_13 1 the brazilians pumped the oil across the river . c_13 1 lenin believes the tsar to be a power hungry dictator . c_13 1 brezhnev had said for andropov to leave . c_13 1 yeltsin saw stalin holding the bag . c_13 1 robert thinks that students should eat asparagus . c_13 1 robert thinks that student should eat asparagus . c_13 1 linguistics students like phonetics tutorials . c_13 1 martha said that bill loved his cheerios in the morning . c_13 1 eloise wants you to study a new language . assume to = t . c_13 1 for maurice to quarrel with joel frightened maggie . c_13 1 no man has ever beaten the centaur . c_13 0 * some man has ever beaten the centaur . c_13 0 * every man has ever beaten the centaur . c_13 1 rosemary hates new york . c_13 0 * rosemary hates . c_13 1 jennie smiled . c_13 0 * jennie smiled the microwave . c_13 1 traci gave the whale a lollipop . c_13 0 * traci gave the whale . c_13 0 * traci gave a lollipop . c_13 1 ryan hit andrew . c_13 1 michael accidentally broke the glass . c_13 1 leah likes cookies . c_13 1 lorenzo saw the eclipse . c_13 1 syntax frightens kenny . c_13 1 alyssa kept her syntax book . c_13 1 the arrow hit ben . c_13 1 the psychologist hates phonology . c_13 1 doug went to chicago . c_13 1 dave was given the margarita mix . c_13 1 george gave jessica the book . c_13 1 daniel received a scolding from hanna . c_13 1 bob gave steve the syntax assignment . 
c_13 1 stacy came directly from linguistics class . c_13 1 andrew is in tucson 's finest apartment . c_13 1 chris hacked the computer apart with an axe . c_13 1 this key will open the door to the linguistics building . c_13 1 he bought these flowers for aaron . c_13 1 she cooked matt dinner . c_13 0 * john placed the flute . c_13 1 john put the book on the table . c_13 1 john put the book on the table with a pair of tongs . c_13 1 megan loves kevin . c_13 0 * megan loves . c_13 0 * megan loves jason . c_13 1 it rained . c_13 1 it snowed . c_13 1 it hailed . c_13 1 that bill loves chocolate is likely . c_13 1 it is likely that bill likes chocolate . c_13 1 i put a book on it . c_13 1 it bit me on the leg . c_13 1 shannon sent dan an email . c_13 1 stacy hit a baseball to julia . c_13 1 jaime danced a jig . c_13 1 yuko rubbed the pizza with a garlic clove . c_13 1 it is raining in san francisco . c_13 1 the stodgy professor left with his teaching assistant . c_13 1 i played a tune on my ipod . c_13 1 molly gave calvin a kiss . c_13 1 mercedes gave a test to the students in the lecture hall . c_13 1 spot ate a cat treat . c_13 1 susan ate yesterday at the restaurant . c_13 1 gwen looked at a fire truck . c_13 1 michael asked a question . c_13 1 adam asked if hyacinth likes pineapples . c_13 1 i feel it is unfortunate that television is so vulgar these days . c_13 1 that angus hates sushi is mysterious . c_13 0 * jennie smiled the sandwich . c_13 0 * placed the flute on the table . c_13 0 * john placed on the table . c_13 0 * john placed the flute the violin on the table . c_13 0 * the rock placed the sky with the fork . c_13 0 * john placed the flute the table . c_13 1 john bit the apple . c_13 1 susan forgave louis . c_13 1 the jockey rides the horse . c_13 1 phillip gave the soldier the medal . c_13 1 the apple was bitten . c_13 1 louis was forgiven . c_13 1 the horse was ridden . c_13 1 the medal was given to the soldier . c_13 1 the soldier was given the medal . 
c_13 1 the apple was bitten by john . c_13 1 louis was forgiven by susan . c_13 1 the horse was ridden by the jockey . c_13 1 the medal was given to the soldier by phillip . c_13 1 the soldier was given the medal by phillip . c_13 1 i ate a basket of apples . c_13 1 i ate . c_13 1 i think that john likes his beer . c_13 1 i think john likes his beer . c_13 0 * i think for john to like his beer . c_13 0 * i think if john likes his beer . c_13 1 i ordered that john drink his beer . c_13 1 i ordered john drink his beer . c_13 0 * i ordered for john to drink his beer . c_13 1 i ordered john to drink his beer . c_13 0 * i ordered if john drink his beer . c_13 0 * i inquired that john like his beer . c_13 0 * i inquired john likes his beer . c_13 0 * i inquired for john to like his beer . c_13 0 * i inquired john to like his beer . c_13 1 i inquired if john likes his beer . c_13 1 heidi thinks that andy is eating salmon flavored candy bars . c_13 1 heidi thinks that andy has eaten salmon flavored candy bars . c_13 1 heidi thinks that andy will eat salmon flavored candy bars . c_13 1 heidi thinks that andy eats salmon flavored candy bars . c_13 1 heidi thinks that the salmon flavored candy bars were eaten . c_13 1 he has danced . c_13 1 i had eaten the deep fried muffins . c_13 1 i have eaten the beef waffles . c_13 1 i will have eaten the beef waffles . c_13 1 jeff was dancing with sylvia while amy sat angrily at their table . c_13 1 the soup had been being eaten when it got spilled . c_13 1 jeff must have eaten the deep fried muffin . c_13 0 * jeff has must eaten the deep fried muffin . c_13 1 jeff must not have eaten the deep fried muffin . c_13 0 * jeff not must have eaten the deep fried muffin . c_13 1 calvin has a peanut . c_13 1 susan has a cold . c_13 1 bill had an accident . c_13 1 calvin has eaten a peanut . c_13 1 frank has drunk too much . c_13 1 bill has been dancing . c_13 1 dane is a doctor . c_13 1 jorge was the one . c_13 1 alex was eating the popsicle . 
c_13 1 megan was sat on by her brother . c_13 1 catherine did her homework . c_13 1 catherine did not eat . c_13 1 calvin did not do a back flip . c_13 1 has bill eaten his tuna ? c_13 1 is bill eating his tuna ? c_13 1 did bill eat his dinner ? c_13 0 * ate bill his dinner ? c_13 0 * has calvin a bowl ? c_13 1 angus is not leaving . c_13 1 calvin has not eaten his dinner . c_13 1 spot did not play with his mouse . c_13 0 * calvin ate not his dinner . c_13 0 * calvin has not any catnip . c_13 0 * angus did not his homework . c_13 1 i should not eat plums . c_13 0 * i have not should eat plums . c_13 0 * i must can eat plums . c_13 0 * i have should eat plums . c_13 0 * i want to should eat plums . c_13 1 calvin will not eat the beef waffles . c_13 0 * calvin not will eat the beef waffles . c_13 0 * calvin could will eat the beef waffles . c_13 0 * calvin will could eat the beef waffles . c_13 1 i ate deep fried muffins . c_13 1 he always eats deep fried muffins . c_13 0 * i might ate deep fried muffins . c_13 0 * he always might eats deep fried muffins . c_13 1 he might eat deep fried muffins . c_13 1 i might eat deep fried muffins . c_13 1 he will eat deep fried muffins . c_13 0 * he will eats deep fried muffins . c_13 1 sylvia will be slapping jeff upside the head in martial arts class . c_13 1 sylvia could be slapping jeff upside the head in martial arts class . c_13 1 sylvia is slapping jeff upside the head in martial arts class . c_13 1 the cat had eaten . c_13 1 the cat had been eating . c_13 1 the tuna had been eaten . c_13 0 * the cat had haven eaten . c_13 1 the cat was leaving . c_13 1 the tuna was being eaten . c_13 0 * the cat was being eating . c_13 0 * the cat was having eaten . c_13 1 the cake was eaten . c_13 0 * the cake was been eating . c_13 0 * the cake was have eaten . c_13 1 reggie did not chase the ball . c_13 1 did calvin eat the beef waffles ? c_13 1 what did calvin eat ? c_13 0 * john must not do have eaten . 
c_13 0 * john must do not have eaten . c_13 1 the prisoner must have been being interrogated when the supervisor walked into the room and saw what was going on and put a stop to it . c_13 1 fiona must not eat the sauteed candy canes . c_13 1 fiona has not eaten the sauteed candy canes . c_13 1 can fiona eat sauteed candy canes ? c_13 0 * i wanted that he should leave . c_13 0 * i wanted he should leave . c_13 0 * i wanted if he should leave . c_13 1 i wanted him to leave . c_13 1 i wanted to leave . c_13 0 * heidi investigated that john ate the cauliflower . c_13 0 * heidi investigated john ate the cauliflower . c_13 1 heidi investigated whether john ate the cauliflower . c_13 1 heidi investigated if john ate the cauliflower . c_13 0 * heidi investigated john to eat the cauliflower . c_13 0 * heidi investigated to eat the cauliflower . c_13 1 john said heidi was obsessed with broccoli . c_13 0 * john said if heidi was obsessed with broccoli . c_13 0 * john said heidi to eat the broccoli . c_13 1 andy promised that we would go . c_13 1 andy promised we would go . c_13 0 * andy promised if we would go . c_13 0 * andy promised us to go . c_13 1 andy promised to go . c_13 1 if i were a rich man , i 'd buy a diamond ring . c_13 0 * if he is a rich man , he 'd buy a diamond ring . c_13 1 rory eats . c_13 1 rory ate muffins . c_13 1 the muffins were eaten . c_13 1 rory had eaten the muffins . c_13 1 rory has eaten the muffins . c_13 1 rory must have eaten the muffins . c_13 1 rory may be eating the muffins . c_13 1 rory will eat the muffins . c_13 1 rory eats muffins . c_13 1 rory is eating muffins . c_13 1 rory might have been eating the muffins . c_13 1 the muffins might have been being eaten . c_13 1 the tuna had been being eaten . c_13 1 calvin will eat . c_13 1 the tuna will be eaten . c_13 1 calvin will be eating . c_13 1 calvin will have eaten . c_13 1 the tuna will be being eaten . c_13 1 the tuna will have been eaten . c_13 1 calvin will have been eating . 
c_13 1 calvin was eating . c_13 1 calvin had eaten . c_13 1 calvin had been eating . c_13 1 the tuna must have been eaten . c_13 1 the tuna will have been being eaten . c_13 1 he has not eaten yet today . c_13 1 i have never seen this movie . c_13 1 i never have a pen when i need it . c_13 1 i have always loved peanut butter . c_13 1 i do not love peanut butter . c_13 1 martha often thinks kim hates phonology . c_13 1 do you like peanut butter ? c_13 1 have you always hated peanut butter ? c_13 1 are you always thinking dirty thoughts ? c_13 1 bradley left . c_13 1 stacy left tucson . c_13 1 john left his wife . c_13 0 * i want bradley that left . c_13 0 * john thinks that left . c_13 1 that john will leave is likely . c_13 1 it is likely that john will leave . c_13 1 the policeman kissed the puppy . c_13 1 the puppy was kissed by the policeman . c_13 1 the puppy was kissed . c_13 1 john laughed . c_13 1 the audience laughed . c_13 0 * bill is likely john to hit . c_13 0 * it was kissed the puppy . c_13 1 jennifer swatted steve . c_13 1 steve swatted jennifer . c_13 1 she swatted him . c_13 1 he swatted her . c_13 1 i walk . c_13 1 you walk . c_13 1 it is likely that patrick left . c_13 1 that patrick left is likely . c_13 0 * patrick is likely that left . c_13 0 * it is likely patrick to leave . c_13 0 * patrick to leave is likely . c_13 1 patrick is likely to leave . c_13 1 he kissed her . c_13 1 she was kissed . c_13 0 * she was kissed him . c_13 0 * it was kissed her . c_13 1 stacy danced at the palace . c_13 1 stacy arrived at the palace . c_13 0 * there danced three men at the palace . c_13 0 * there arrived three men at the palace . c_13 0 * it seems sonny to love cher . c_13 0 * bill was bitten the dog . c_13 0 * donny is likely that left . c_13 1 the shah slept in a bed . c_13 1 the bed was slept in by the shah . c_13 1 dust fell on the bed . c_13 0 * the bed was fallen on by the dust . c_13 1 bill was hit by the baseball . 
c_13 0 * was been hit by bill by the baseball . c_13 1 bill gave sue the book . c_13 1 sue was given the book by bill . c_13 0 * the book was been given by bill by sue . c_13 1 i cut the soft bread . c_13 1 the soft bread cuts easily . c_13 1 the boat sank . c_13 1 the torpedo sank the boat . c_13 1 the captain sank the boat . c_13 1 the captain sank the boat with a torpedo . c_13 0 * was sunk by the boat . c_13 1 the boat was sunk by the captain with a torpedo . c_13 1 i sent a book to louis . c_13 1 i sent louis a book . c_13 1 a book was sent to louis . c_13 0 * louis was sent a book to . c_13 0 * to louis was sent a book . c_13 1 louis was sent a book . c_13 0 * a book was sent louis . c_13 1 john seems to have left . c_13 1 bill wants john to leave . c_13 1 john wants bill to leave . c_13 1 john wants him to leave . c_13 1 john believes him to have been at the game . c_13 1 he is believed by john to have been at the game . c_13 1 he is believed to have been at the game . c_13 1 becky bought the syntax book . c_13 1 what did becky buy ? c_13 1 what did stacy say becky bought ? c_13 1 matt kissed her . c_13 1 whom did matt kiss ? c_13 1 i wonder who jim kissed . c_13 1 the fact that i like strawberry flavored milk shakes is none of your business . c_13 1 she made the outrageous claim that tuna flavored milkshakes are good for you . c_13 1 i asked where you found it . c_13 1 i wo n't reveal the place . c_13 1 i asked who she kissed . c_13 1 i know several people who she kissed . c_13 1 i know several people she kissed . c_13 1 i know several people that she kissed . c_13 1 i know i bought the book you recommended . c_13 1 i know i bought the book that you recommended . c_13 1 the guy who is wearing the red hat just hit me ! c_13 1 that guy , who i think might be drunk , just hit me ! c_13 0 * the man , who i think might be drunk , that is escaping hit me . c_13 1 what did bill claim that he read ? c_13 1 what do you think matt kissed ? 
c_13 0 * what did bill make the claim that he read in the syntax book ? c_13 0 * which cake did you see the man who baked ? c_13 1 i wonder what john bought . c_13 1 how do you think john bought the sweater ? c_13 0 * how do you wonder what john bought ? c_13 1 how do you think john bought what ? c_13 1 i wonder what john bought how . c_13 1 i wonder what john kissed . c_13 0 * who did you wonder what kissed ? c_13 1 i asked what john kissed . c_13 1 that the police would arrest several rioters was a certainty . c_13 1 i liked mary and john . c_13 0 * who did you like and john ? c_13 1 i ate some popcorn and drank some soda . c_13 0 * what did you eat some popcorn and drink ? c_13 1 who loves who ? c_13 1 who loves whom ? c_13 1 shelly loves who ? c_13 1 fred saw a spaceship in the linguistics lounge ? c_13 1 what is bothering you ? c_13 1 who has seen my snorkel ? c_13 1 how was the plot discovered by the authorities ? c_13 1 which animals appear to have lost their collars ? c_13 1 what did jean think was likely to have been stolen ? c_13 1 car sales have surprised the stockbrokers . c_13 1 can you find the light bulb store ? c_13 1 john was bitten by an advertising executive . c_13 1 it is likely that tami will leave new york . c_13 1 tami is likely to leave new york . c_13 1 lucy seems to have been mugged . c_13 1 what did you buy at the supermarket ? c_13 1 what is it likely for beth to have bought at the supermarket ? c_13 1 what is likely to have been bought at the supermarket ? c_13 1 the trail we walked today was built by slave labor . c_13 1 bill is always complaining about the guys who work near him . c_13 1 the cost of bagels that are imported from iceland surprised the teacher who mike hired last week . c_13 0 * josh gave clay carefully a book . c_13 1 josh gave clay a book carefully . c_13 1 briana showed justin himself . c_13 0 * briana showed himself justin . c_13 1 i blew up the building . c_13 1 i blew the building up . c_13 0 * i blew up it . 
c_13 1 i blew it up . c_13 1 susan sent the package to heidi . c_13 1 i asked mike if he had seen the yeti . c_13 1 i bought some flowers for manuel . c_13 1 i bought manuel some flowers . c_13 1 jean is likely to leave . c_13 1 jean is reluctant to leave . c_13 0 * jean is likely . c_13 1 jean wants brian to leave . c_13 1 jean persuaded brian to leave . c_13 1 that jean left is likely . c_13 1 it is likely that jean left . c_13 0 * is likely jean to leave . c_13 0 * it is reluctant that jean left . c_13 0 * that jean left is reluctant . c_13 1 jean is likely to dance . c_13 1 the cat is out of the bag . c_13 1 the cat thinks that he is out of the bag . c_13 0 * is likely to jean dance . c_13 1 it is likely that jean will dance . c_13 1 jean wants robert . c_13 1 jean wants him . c_13 0 * i want she to dance . c_13 1 i want jean . c_13 1 i want jean to dance . c_13 1 jean wants herself to dance . c_13 1 jean is reluctant . c_13 1 to find a new mate , go to a dating service . c_13 1 jean tried to behave . c_13 1 robert knows that it is essential to be well behaved . c_13 1 robert knows that it is essential . c_13 1 robert knows it is essential that he is well behaved . c_13 1 louis begged kate to leave . c_13 0 * louis begged kate that she leave her job . c_13 0 * louis begged kate to shave himself . c_13 1 louis begged kate that he be allowed to shave himself . c_13 1 to behave oneself in public is expected . c_13 1 robert knew that it was necessary to behave himself . c_13 1 mike expected greg incorrectly to take out the trash . c_13 1 the boys do n't all want to leave . c_13 1 robert is eager to do his homework . c_13 1 jean seems to be in a good mood . c_13 1 rosemary tried to get a new car . c_13 1 susan begged bill to let her sing in the concert . c_13 1 susan begged to be allowed to sing in the concert . c_13 1 christina is ready to leave . c_13 1 fred was believed to have wanted to try to dance . c_13 1 susan consented to try to seem to have been kissed . 
c_13 1 alan told me who wanted to seem to be invincible . c_13 1 what did john want to eat ? c_13 1 this book is easy to read . c_13 1 john is easy to please . c_13 1 to improve myself is a goal for next year . c_13 1 to improve yourself would be a good idea . c_13 1 to improve himself , bruce should consider therapy . c_13 1 to improve herself , jane went to a health spa . c_13 1 kathleen really hates her job . c_13 1 my brother likes collecting jazz records . c_13 1 martina is deathly afraid of spiders . c_13 1 that kind of behavior annoys me . c_13 1 the news pleased the students . c_13 1 horror films disturb milo . c_13 1 the exhibition really impressed the critics . c_13 1 kathleen hates those pictures of herself . c_13 1 the children admired photos of each other . c_13 1 sandra hates reading about herself in the tabloids . c_13 1 pictures of himself always disturb milo . c_13 1 to be able to buy myself a ticket to france would be a dream . c_13 1 reading about herself in the tabloids always annoys sandra . c_13 1 brandon has been reading more novels than he has short stories . c_13 1 robin will eat cabbage but she wo n't ice cream . c_13 1 john could bake something , but i 'm not sure what . c_13 1 frank will eat an apple and morgan will too . c_13 1 frank will eat an apple and morgan will eat an apple too . c_13 1 calvin will strike himself . c_13 1 calvin will strike himself and otto will too . c_13 1 calvin will strike himself and otto will strike himself too . c_13 1 calvin has dated every girl who jeff has . c_13 1 calvin has dated every girl who jeff has dated . c_13 1 i know which guys you 've dated , but i do n't know which guys you have n't . c_13 0 * which language do you want to hire someone who speaks ? c_13 0 * they want to hire someone who speaks a balkan language , but i do n't know which language . c_13 1 calvin will fire someone today , but i do n't know who . c_13 1 peter was talking with someone but i do n't know who . 
c_13 1 brandon read every book that megan did . c_13 1 every book that megan did brandon read too . c_13 1 darin has eaten more squid than john has octopus . c_13 1 what does calvin like . c_13 1 alexandra wants to catch a fish and sylvia does too . c_13 1 calvin admired himself in the mirror . c_13 0 * chris said that himself was sad . c_13 1 chris wants himself to win . c_13 1 which pictures of himself did chris see in the gallery ? c_13 1 chris liked which pictures of himself ? c_13 1 which pictures of himself did chris like ? c_13 0 * heidi believes bill 's description of herself . c_13 1 heidi thinks that she has won . c_13 1 heidi thinks that pictures of herself are beautiful . c_13 1 heidi gave a present to herself . c_13 1 the army 's destruction of the palace was a tragedy . c_13 1 the army destroyed the palace . c_13 1 heidi wants to kiss herself . c_13 0 * heidi believes john 's description of herself . c_13 0 * heidi dislikes the tv 's depiction of herself . c_13 1 heidi said that pictures of herself were embarrassing . c_13 0 * heidi said that bill 's pictures of herself were embarrassing . c_13 0 * chris said that himself was angry . c_13 1 heidi saw peter 's picture of her . c_13 1 heidi saw drawings of her . c_13 1 john loves himself . c_13 1 john loves pictures of himself . c_13 0 * john loves mary 's pictures of himself . c_13 0 * john thinks that mary 's depiction of himself is wrong . c_13 1 john thinks that most depictions of himself are wrong . c_13 1 john seems to like pictures of himself . c_13 1 john believes himself to be the best at baseball . c_13 1 john wants to congratulate himself . c_13 0 * john loves him . c_13 1 john loves his puppy . c_13 1 john asked if the unflattering description of his work would be published in the paper . c_13 1 john asked if his essay would be published in the paper . d_98 1 any owl can hunt mice . d_98 0 * john talked to any woman . d_98 0 * any woman contributed to the fund . 
d_98 1 john talked to any woman who came up to him . d_98 1 any woman who heard the news contributed to the fund . d_98 1 any man who saw the fly in the food did n't eat dinner . d_98 1 you may pick any flower . d_98 0 * you must pick any flower . d_98 1 any pilot could be flying this plane . d_98 0 * any pilot must be flying this plane . d_98 1 any student must work hard . d_98 1 any doctor will tell you that . d_98 1 any soldier should be prepared to die for her country . d_98 1 john talked to a woman . d_98 1 john did n't talk to a woman . d_98 1 john kissed even the ugliest woman . d_98 1 john kissed even the ugliest woman who came up to him . d_98 1 a lion is usually majestic . d_98 0 * any lion is usually majestic . d_98 1 a philosopher is sometimes wrong . d_98 1 any philosopher is sometimes wrong . d_98 1 you must pick a flower . d_98 1 a pilot must be flying this plane . d_98 1 a student must work hard . d_98 1 a soldier should be prepared to die for her country . d_98 1 rarely is any lion majestic . d_98 1 seldom is any lion majestic . d_98 1 never is any lion majestic . d_98 0 * usually , any lion is majestic . d_98 0 * often , any lion is majestic . d_98 0 * always , any lion is majestic . d_98 1 you may pick absolutely any flower . d_98 1 you may pick almost any flower . d_98 1 almost any pilot could be flying this plane . d_98 1 absolutely any pilot could be flying this plane . d_98 1 you may pick any flower except the rose . d_98 1 any pilot except sue could be flying this plane . d_98 1 john talked to absolutely any woman who came up to him . d_98 1 john talked to almost any woman who came up to him . d_98 1 john talked to any woman who came up to him except sue . d_98 1 john put carrots from his garden in the salad . d_98 1 john put any carrot from his garden in the salad . d_98 1 john talked to a woman who came up to him . d_98 1 a woman who heard the news contributed to the fund . d_98 1 a man who saw the fly in the food did n't eat dinner . 
d_98 1 john talked to every woman who came up to him . d_98 1 every woman who heard the news contributed to the fund . d_98 1 every man who saw the fly in the food did n't eat dinner . d_98 1 john talked to every woman . d_98 1 mary regretted that she did anything to help him . d_98 0 * mary talked to any man or any woman . d_98 1 every student who is in mary 's class is working on polarity items . d_98 1 it happens to be true of every student in mary 's class that he is working on polarity items . d_98 1 every student in mary 's class , by virtue of being in her class , is working on polarity items . d_98 1 every student in mary 's class happened to vote republican . d_98 1 every woman standing under that tree is mary 's friend . d_98 1 the president thanked every soldier who had fought in the . d_98 1 everybody who attended last week 's huge rally signed the petition . d_98 1 we did n't keep a list of the names , but the president thanked every soldier who had fought in the gulf war . d_98 0 * every student in mary 's class , whoever they were , happened to vote republican . d_98 0 * every woman standing under that tree , whoever she may be , is mary 's friend . d_98 0 * any student in mary 's class happened to vote . d_98 0 * any woman standing under that tree is mary 's friend . d_98 1 the president thanked any soldier who had fought in the gulf . d_98 1 every restaurant that advertises in any of these papers happens to have four stars in the handbook . d_98 1 everybody who is in mary 's semantics seminar is writing a paper on polarity items . d_98 1 john talked to any woman at the party . d_98 1 john talked to any politician who is powerful . d_98 0 * john talked to any powerful politician . d_98 1 mary confidently answered any objections . d_98 1 after the dinner , we threw away any leftovers . d_98 0 * john bought any picture of queen elizabeth . d_98 1 john bought any picture of queen elizabeth that was on sale . 
d_98 1 every philosopher is sometimes wrong , but he usually does n't admit it . d_98 0 * any lion is generally majestic . d_98 0 * any lion is rare . d_98 1 any female tiger has orange fur , marked with black stripes . d_98 1 birds fly . d_98 1 any bird flies . d_98 1 all fugitives are in jail now . d_98 1 all lizards will die . d_98 0 * yesterday john talked to any woman . d_98 1 yesterday john talked to any woman he saw . d_98 1 snow is white and snow is not white . d_98 1 any man did n't eat dinner . d_98 1 mary talks to any student . d_98 0 * mary talked to any angry student . d_98 1 mary talked to any student who was angry . d_98 0 * mary talked to any actual student . d_98 0 * any pilot on duty today must be flying this plane . d_98 1 any pilot must be out flying planes today . d_98 1 every student read any book on giraffes he found . d_98 0 * you must pick any flower in this bed . d_98 1 you may pick any of the flowers . d_98 0 * you must pick any of the flowers . d_98 0 * mary picked any of the flowers . d_98 1 you may pick every flower . d_98 1 you may pick any flower , but leave a few for mary . d_98 1 you may pick any five flowers . d_98 1 mary did n't pick any of the flowers . d_98 1 pick any flower . d_98 1 confiscate any liquor . d_98 1 pick any of these flowers . d_98 0 * confiscate any of this liquor . d_98 0 * mary did n't see almost every flower . d_98 0 * mary did n't see almost any flower . d_98 1 every student in mary 's class is working on negative polarity . d_98 1 there were twenty students at the lecture and every student who was there said it was inspiring . d_98 0 * there were twenty students at the lecture and any student who was there said it was inspiring . d_98 1 we have many graduate students but this year the graduate director met with every student in the graduate program individually to discuss their progress . 
d_98 0 * we have many graduate students but this year the graduate director met with any student in the graduate program individually to discuss their progress . d_98 1 susan found every book she had been looking for at borders . d_98 0 * susan found any book she had been looking for at borders . d_98 1 paul has interviewed every student who was at the scene of the crime and kate has interviewed them too . d_98 0 * paul has interviewed any student who was at the scene of the crime and kate has interviewed them too . d_98 0 * professor smith would support sue and prof jones bill . d_98 1 there is every book by chomsky in this library . d_98 0 * there is any book by chomsky in this library . d_98 1 there 's everything mary had asked for in this store . d_98 0 * there 's anything mary had asked for in this store . d_98 1 there is any book you could imagine in this library . d_98 1 there 's anything mary could desire in this store . d_98 1 that evening john laughed with everybody he talked to . d_98 1 that evening john laughed with anybody he talked to . d_98 1 john talked to everybody who came up to him at the party . d_98 1 john talked to anybody who came up to him at the party . d_98 1 bill offered mary everything he had cooked for dinner . d_98 0 * bill offered mary anything he had cooked for dinner . d_98 1 those days bill offered mary everything he cooked . d_98 1 those days bill offered mary anything he cooked . d_98 1 john made a fool of himself in front of everyone who was there . d_98 1 john made a fool of himself in front of anyone who was there . d_98 1 mary sang for everyone who wanted to hear her . d_98 1 mary sang for anyone who wanted to hear her . d_98 1 john slipped in front of everyone who was there . d_98 0 * john slipped in front of anyone who was there . d_98 1 at 4 p.m . i saw john lecturing to everyone who was near him . d_98 0 * at 4 p.m . i saw john lecturing to anyone who was near him . 
d_98 1 john knew every language that we encountered on our trip . d_98 1 john knew any language that we encountered on our trip . d_98 1 john liked everything that was placed before him . d_98 1 john liked anything that was placed before him . d_98 1 at the end of his speech , the president thanked any soldier who had fought in the gulf war . d_98 1 bob does not think that there is anyone from greece in his basement . d_98 1 can anyone pledge $ 1000 ? d_98 1 is it possible for everyone to to pledge $ 1000 ? d_98 1 is there someone who can pledge $ 1000 ? d_98 1 if anybody comes , he rings the doorbell . d_98 1 every student who wins any trophy displays it in a prominent place . d_98 0 * john saw anything . d_98 1 john did n't see anything . d_98 0 * some who read anything passed . d_98 1 every who read anything passed . d_98 1 no student who read anything passed . d_98 0 * some answered any question . d_98 0 * every student answered any question . d_98 1 any cat does n't like mice . d_98 1 every cat does n't like mice . d_98 1 every cat does n't like mice , for example felix does n't . d_98 1 almost every cat likes mice , but felix does n't . d_98 0 * every cat does n't like mice , but felix does n't . d_98 1 almost every cat likes mice , for example felix does n't . g_81 1 the dodgers beat the red sox and the dodgers were beaten by the giants . g_81 1 the dodgers beat the red sox and the giants beat the dodgers . g_81 1 different teams beat the red sox and were beaten by the giants . g_81 1 john gave the books to mary and the records to sue . g_81 1 how many did you buy of those pies at the fair ? g_81 1 how many have you given of these books to these people . g_81 0 * the man chased fido returned . g_81 1 the man that chased fido returned . g_81 1 the man i think chased fido returned . g_81 0 * the man i think that chased fido returned . g_81 1 the man who i think chased fido returned . g_81 0 * the man who i think that chased fido returned . 
g_81 1 who did you think mary saw ? g_81 1 how slowly would you say he was driving ? g_81 1 how suspicious was mary ? g_81 1 who saw the man ? g_81 1 who do you think that you saw ? g_81 0 * who do you think that saw you ? g_81 1 who do you regret that you saw ? g_81 0 * who do you regret that saw you ? g_81 1 who do you think you saw ? g_81 1 who do you think saw you ? g_81 0 * who do you regret you saw ? g_81 0 * who do you regret saw you ? g_81 0 * who did you believe that came ? g_81 0 * who did you wonder whether came ? g_81 0 * who did you wonder if came ? g_81 0 * who did you arrange for to come ? g_81 0 * which table did you wonder on kim put the book ? g_81 0 * which did you buy the table on kim put the book ? g_81 0 * what do you believe that iron is to be a fact well known to virtually everybody ? g_81 0 * who did you wonder saw kim ? g_81 0 * which did you buy the table supported the book ? g_81 0 * the fact , i put it down to that kim came . g_81 0 * the table , i put kim on which supported the book . g_81 1 who is it that mary likes ? g_81 1 he was talkative . g_81 1 he was a bully . g_81 1 he was talkative and a bully . g_81 0 * the talkative and a bully man entered . g_81 0 * talkative and a bully entered . g_81 0 * john is easy to please and to love mary . g_81 0 * the man who mary loves and sally hates george computed my tax . g_81 1 john is easy to please and to love . g_81 1 the kennel which mary made and fido sleeps in has been stolen . g_81 1 the kennel in which mary keeps drugs and fido sleeps has been stolen . g_81 0 * the kennel in which mary made and fido sleeps has been stolen . g_81 1 john saw more horses than bill saw or pete talked to . g_81 1 john saw more horses than bill saw cows or pete talked to cats . g_81 0 * john saw more horses than bill saw cows or pete talked to . g_81 1 i know a man who bill saw and mary liked . g_81 1 i know a man who saw bill and liked mary . g_81 0 * i know a man who bill saw and liked mary . 
g_81 1 i wonder who bill saw and mary liked . g_81 0 * i wonder who bill saw and liked mary . g_81 1 i wonder who mary likes and hopes will win . g_81 0 * john asked who and where bill had seen . g_81 1 which book and which pencil did john buy ? g_81 0 * where and when did bill put the book ? g_81 1 on which table and under which flower pot did john put the keys ? g_81 1 to which city and to which conference did bill go ? g_81 1 to which city and which conference did bill go ? g_81 1 which city and which conference did bill go to ? g_81 0 * which city and which conference did bill go to to ? g_81 0 * which city and to which conference did bill go to ? g_81 0 * to which city and which conference did bill go to ? g_81 0 * john , who and whose friends you saw , is a fool . g_81 1 john , to who and to whose friends that letter was addressed , is a fool . g_81 1 i wonder when and how often she went that day . g_81 1 i wonder who and whose friends he handed over to the fbi . g_81 1 i have wanted to know exactly what happened to rosa luxemburg for many years . g_81 1 i have wanted to know for many years exactly what happened to rosa . g_81 1 i had hoped that it was true that rosa luxemburg had actually defected to iceland for many years . g_81 1 i had hoped that it was true for many years that rosa luxemburg had actually defected to iceland . g_81 1 i have wanted to meet the man who spent so much money planning the assassination of kennedy for many years . g_81 1 i have wanted to meet for many years the man who spent so much money planning the assassination of kennedy . g_81 1 the woman believed that the man was ill who was here . g_81 1 the woman believed that the man who was here was ill . g_81 1 the woman who was here believed that the man was ill . g_81 1 a woman hit a girl who was pregnant . g_81 1 people are said to do crazier things at higher speeds there by dorothy than they are by other people . 
g_81 1 people are said to do such crazy things at such high speeds there by dorothy that i am getting skeptical . g_81 1 a woman hit a pregnant girl . g_81 1 a pregnant woman hit a girl . g_81 1 a man just came in and a woman went out who were similar in all kinds of ways . g_81 1 a man just came in and a woman went out who hate each other like poison and always have . g_81 0 * i find it easy to believe - but joan finds it hard to believe - tom to be dishonest . g_81 0 * john offered , and harry gave , sally a cadillac . g_81 0 * john told , and harry showed , seymour that sally was a virgin . g_81 1 jack may be and tony certainly is a werewolf . g_81 1 harry has claimed but i do not believe that melvin is a communist . g_81 1 i like but tom does n't like to visit new places . g_81 1 i can tell you when , but i ca n't tell you why , he left me . g_81 1 i 've been wondering whether , but would n't positively want to state that . g_81 1 john hummed , and mary sang , the same tune . g_81 1 john hummed , and mary sang , at equal volumes . g_81 1 john gave mary , and joan presented to fred , books which looked . g_81 1 the red sox beat , and the giants were beaten by , different teams . g_81 1 smith loaned , and his widow later donated , a valuable collection of manuscripts to the library . m_02 1 which club did you hit the winning putt with ? m_02 1 with which club did you hit the winning putt ? m_02 1 ethel was sitting at her desk . m_02 0 * the ethel was sitting at her desk . m_02 0 * accountant was sitting at her desk . m_02 1 the accountant was sitting at her desk . m_02 1 accountants audit our finances every year . m_02 0 * i would like an accountants to sort out my tax return . m_02 1 some accountants were quietly counting in the back office . m_02 1 would more accountants make any difference to my tax bill ? m_02 1 the truck spread salt . m_02 1 the truck spread the salt . m_02 1 the truck spread salts . m_02 1 this truck spread less salt than that one . 
m_02 0 * this truck spread fewer salt than that one . m_02 1 there are fewer trucks on the motorway this winter . m_02 1 there are less trucks on the motorway this winter . m_02 0 * the white rabbit vanished his watch . m_02 1 dogs chase cats . m_02 0 * dogs chase . m_02 1 flora cooks . m_02 1 flora cooks gourmet meals . m_02 1 the cat shot into the kitchen on sunday morning carrying a dead mouse . m_02 1 the cat sauntered into the kitchen carrying a dead mouse . m_02 1 maisie drove her car from morningside to leith on wednesday . m_02 1 on wednesday maisie drove her car from morningside to leith . m_02 1 maisie drove her car on wednesday from morningside to leith . m_02 1 jeeves sauntered into the room . m_02 0 * into jeeves sauntered the room . m_02 1 into the room sauntered jeeves . m_02 1 which room did jeeves sauntered into ? m_02 1 into which room did jeeves sauntered ? m_02 1 barbara handed the results to alan on tuesday . m_02 1 the pupils in this maths class gave cakes to margaret every . m_02 1 cakes were given to margaret every friday by the pupils in this maths class . m_02 1 this parcel is very heavy . m_02 1 this very heavy parcel was delivered yesterday . m_02 1 very heavy , this parcel ! m_02 1 what this parcel is is very heavy . m_02 1 we felled the murder with this chainsaw . m_02 1 with this chainsaw we felled the murder . m_02 1 barbara handed the intriguing results of the latest examination to alan on tuesday . m_02 1 barbara handed them to alan on tuesday . m_02 1 this large parcel is very heavy . m_02 1 this large parcel is very heavy and so is this small packet . m_02 1 vera is knitting in the lounge . m_02 1 vera is knitting there . m_02 1 grandma is coming to mr chalky 's school tomorrow . m_02 1 grandma is coming here tomorrow . m_02 1 the cat was sleeping in the kitchen . m_02 1 the cat trotted into the kitchen . m_02 1 the mouse jumped out of the cheese box . m_02 1 the mouse was out the cheese box . 
m_02 1 the cat trotted in the kitchen . m_02 1 the cat trotted in . m_02 1 the mouse jumped out . m_02 1 the terrier attacked the burglar . m_02 1 the terrier savaged the burglar 's ankles . m_02 1 the terrier attacked the burglar and the terrier savaged the burglar 's ankles . m_02 1 the terrier attacked the burglar and savaged the burglar 's ankles . m_02 1 did the wealthy young man buy that piano for his secret fiancée ? m_02 1 who bought that piano for his secret fiancée ? m_02 1 what did the wealthy young man buy for his secret fiancée ? m_02 1 who did the wealthy young man buy that piano for ? m_02 1 the wealthy young man bought his secret fiancée that piano . m_02 1 that piano was bought for his secret fiancée by the wealthy young man . m_02 1 i do n't like the plum brandy , but the port i just love . m_02 1 frank bought the piano for jane . m_02 1 frank bought jane the piano . m_02 1 the piano was bought for jane by frank . m_02 1 the piano frank bought for jane . m_02 1 did frank buy the piano for jane ? m_02 1 did frank buy jane the piano ? m_02 1 was the piano bought for jane by frank ? m_02 1 what did frank buy for jane ? m_02 1 frank bought something for jane . m_02 1 did frank buy something for jane . m_02 1 what did frank buy for jane . m_02 1 the children chased the dog . m_02 1 the cook saved no scraps for the dog . m_02 1 sarah devoured the cakes in the kitchen last night . m_02 1 mr knightley despaired . m_02 1 emma slighted miss bates . m_02 1 jane fairfax seemed upset . m_02 1 mr woodhouse sat in an armchair . m_02 1 mr knightley walked into the drawing room . m_02 1 mr elton handed his wife into the carriage . m_02 1 emma gave bad advice to harriet . m_02 1 mr knightley suggested that thieves would break into hartfield . m_02 1 eleanor blamed willoughby for marianne 's unhappiness . m_02 1 eleanor blamed marianne 's unhappiness on willoughby . m_02 1 the romans built this aqueduct . 
m_02 1 the computer will calculate the value of the variable . m_02 1 these objections killed the proposal . m_02 0 * lecturer was sitting at her desk . m_02 1 too much salt damages vehicles . m_02 0 * too much vehicles are damaged by salt . m_02 0 * too many salt damages vehicles . m_02 1 too many vehicles are damaged by salt . m_02 1 frank churchill gave a piano to jane fairfax . m_02 1 a piano was given to jane fairfax by frank churchill . m_02 1 wickham eloped with lydia . m_02 1 miss bates can chatter on for hours . m_02 1 henry crawford loved fanny but fanny loved edmund . m_02 1 mr bingley became tired of jane or mr d'arcy persuaded mr . m_02 1 elizabeth regretted that she had met wickham . m_02 1 catherine feared that the abbey was haunted . m_02 1 that anne was in conversation with mr elliott dismayed captain . m_02 1 fanny was delighted by the idea that she could subscribe to a library . m_02 1 who thought up the proposal that the committee be abolished ? m_02 1 the cottage which mrs dashwood accepted was rather small . m_02 1 the gentleman who saved marianne was willoughby . m_02 1 the building that we liked is in thornton lacey . m_02 1 it was anne elliott who loved captain wentworth but who rejected his first proposal . m_02 1 a motorist has reported that the road is blocked by snow at bunker hill . m_02 1 the labrador ate all the food which we left on the kitchen table . m_02 1 show me the folder in which you stored the documents . m_02 1 i like the book that you gave me . m_02 1 i love the food they cook in the halls of residence . m_02 1 a motorist has reported the road is blocked at bunker hill . m_02 1 i am delighted at the idea they might demolish the appleton tower . m_02 1 the cottage which mrs dashwood accepted was very small . m_02 1 anne musgrave has just seen mr elliott in bath street . m_02 1 nurse rooke has discovered where anne elliott stayed . m_02 1 nurse rooke suspected that mrs clay planned to run away with . 
m_02 1 anne astonished her father . m_02 1 that captain wentworth married anne astonished her father . m_02 1 sir walter elliott imagined the scene . m_02 1 sir walter elliott imagined that he was still handsome . m_02 1 yesterday lydia eloped with wickham . m_02 1 lydia eloped with wickham yesterday . m_02 1 when lydia went to brighton , she eloped with wickham . m_02 1 lydia eloped with wickham when she went to brighton . m_02 1 because of the strike the commuters travelled by army lorry . m_02 1 the commuters travelled by army lorry because of the strike . m_02 1 because the bus drivers were on strike , the commuters travelled by army lorry . m_02 1 the commuters travelled by army lorry because the bus drivers were on strike . m_02 1 although mr d'arcy disliked mrs bennet he married elizabeth . m_02 1 in spite of his dislike of mrs bennet , mr d'arcy married elizabeth . m_02 1 if emma had left hartfield , mr woodhouse would have been unhappy . m_02 1 did captain wentworth write a letter to anne elliott ? m_02 1 write a letter to anne elliott . m_02 0 * because did marianne love willoughby , she refused to . m_02 0 * if did emma leave hartfield , mr woodhouse would be unhappy . m_02 0 * when did fanny return , she found tom bertram very ill . m_02 0 * the cottage which did mrs dashwood accept was rather small . m_02 0 * catherine feared that was the abbey haunted . m_02 1 the girls wondered who mr bennet had received in his library . m_02 1 we were wondering who did you meet at the conference . m_02 1 she said that in came aunt norris . m_02 1 she said that into the room came aunt norris . m_02 0 * the person who in came at that moment was aunt norris . m_02 0 * because in came aunt norris , fanny stopped talking . m_02 0 * when in came aunt norris , fanny stopped talking . m_02 0 * because into the room came aunt norris , fanny stopped talking . m_02 0 * when into the room came aunt norris , fanny stopped talking . m_02 1 never had sir thomas been so offended . 
m_02 0 * the person who never had he been so offended was sir thomas . m_02 0 * because never had sir thomas been so offended , even mr yates left . m_02 0 * when never had sir thomas been so offended , mr yates left . m_02 1 dr jones habitually ate too much rich food , did n't he ? m_02 0 * we realised that dr jones died because he ate too much rich food , did n't he ? m_02 0 * the person who ate too much rich food did n't he was dr . m_02 0 * because dr jones ate too much rich food did n't he , he died of apoplexy . m_02 0 * when dr jones died of apoplexy did n't he , mary crawford went to live with his wife . m_02 1 fanny stopped talking because in came aunt norris . m_02 0 * because in came aunt norris fanny stopped talking . m_02 1 mr yates left because never had sir thomas been so offended . m_02 0 * because never had sir thomas been so offended , mr yates left . m_02 0 * fanny stopped talking when in came aunt norris . m_02 0 * when in came aunt norris fanny stopped talking . m_02 0 * fanny continued talking although in came aunt norris . m_02 0 * although in came aunt norris , fanny continued talking . m_02 1 fanny had just stopped talking when in came aunt norris . m_02 1 fanny regretted talking to mary . m_02 1 henry wanted to marry fanny . m_02 1 mrs bennet having taken the others upstairs , mr bingley proposed to . m_02 1 all mr collins does is praise lady de bourg . m_02 1 lady de bourg tried to persuade elizabeth to renounce mr d'arcy . m_02 1 henry wanted to have married fanny before edmund returned . m_02 1 fanny regretted having talked to mary . m_02 1 what mr collins is doing is praising lady de bourg . m_02 0 * fanny regretted being talking to mary . m_02 0 * all mr collins has done is have praised lady de bourg . m_02 1 julia and maria wanted to be allowed to perform a play . m_02 1 edmund wanted fanny to be able to ride a horse . m_02 0 * henry wanted to possibly marry fanny . m_02 1 fanny loved talking to mary . 
m_02 1 slamming the door , he ran down the steps . m_02 0 * he was knowing the country well . m_02 1 when ripe , these apples will be delicious . m_02 1 the tigers hunt prey at night . m_02 1 fiona hoped to meet the prime minister . m_02 1 arthur tried to bake a cake . m_02 1 fiona persuaded arthur to bake a cake . m_02 1 susan wanted jane to study german . m_02 1 ayala went to the ball and chatted to jonathan stubbs . m_02 0 * ayala went to the ball and jonathan stubbs chatted to . m_02 1 ayala went to the ball and was chatted to by jonathan stubbs . m_02 1 all the beatles came to merle park . m_02 1 the beatles all came to merle park . m_02 1 both jane and elizabeth were at home . m_02 1 jane and elizabeth were both at home . m_02 1 larry hunted all the foxes . m_02 0 * larry all hunted the foxes . m_02 0 * larry hunted the foxes all . m_02 1 george built both the houses . m_02 0 * george both built the houses . m_02 0 * george built the houses both . m_02 1 all the foxes were hunted by larry . m_02 1 augusta blamed herself for what happened . m_02 1 these documents elizabeth is checking at this very moment . m_02 1 louise broke the cup . m_02 1 alison drove the car . m_02 1 martha chewed the bread . m_02 1 the cup was broken by louise . m_02 1 the car was driven by alison . m_02 1 the bread was chewed by martha . m_02 1 these fields were marched over by all the armies of europe . m_02 1 how is someone to chat to a girl if she does not go out ? m_02 1 all the armies of europe marched over these fields . m_02 1 ayala sent back the diamond necklace . m_02 1 ayala sent the diamond necklace back . m_02 1 ayala sent her cousin the diamond necklace . m_02 0 * ayala sent back her cousin the diamond necklace . m_02 1 tatiana wrote to onegin . m_02 1 frank bought a piano for jane . m_02 1 lucy sent a letter to jane . m_02 1 lucy sent jane a letter . m_02 1 the company sent china its senior mining engineers to help plan the new mines . 
m_02 0 * the experts attributed raphael this picture . m_02 0 * i forwarded winifred the letter . m_02 0 * the manager presented the foreman a gold watch . m_02 0 * kick john the ball . m_02 0 * the critics ascribe shakespeare this play . m_02 1 who did john send a book to ? m_02 1 to whom did john send a book ? m_02 1 what place did you travel to ? m_02 1 to what place did you travel ? m_02 1 what place did john send the book ? m_02 0 * who was the book sent by john . m_02 0 * what place was the book sent by john ? m_02 1 only to the best students would he give this book . m_02 0 * only the best students would he give this book . m_02 1 only to glasgow would he go by train . m_02 0 * only glasgow would he travel by train . m_02 1 it is to the best students that he gives this book . m_02 0 * it is the best students he gives this book . m_02 1 it is to ireland that he is going . m_02 0 * it is ireland that he is going . m_02 1 he told her the whole story . m_02 1 she told him the whole story . m_02 1 the other plan she rejected out of hand . m_02 1 the vase got broken that sheila had brought all the way from . m_02 1 the plan was rejected out of hand that traffic should be banned . m_02 1 norman lemming jumped off the cliff and william lemming did so too . m_02 1 norman lemming jumped off the cliff and so did william lemming . m_02 1 harriet could n't marry mr knightley but emma could . m_02 1 what harriet did was marry mr martin . m_02 1 marry mr martin was what harriet did . m_02 1 emma insulted miss bates and annoyed mr knightley . m_02 1 harriet swooned . m_02 1 the book is astonishingly boring . m_02 1 the ethel we all know and love wishes to ask you some awkward questions . m_02 1 golfers can be good company . m_02 1 enthusiastic golfers with large handicaps can be good company . m_02 1 these enthusiastic golfers that i met at the nineteenth hole can be good company . m_02 0 * golfer who is in training has a pretty powerful swing . 
m_02 1 memo ate the spaghetti . m_02 1 memo liked lasagna . m_02 1 emma made harriet her friend . m_02 1 the quiche and i were cooking . m_02 1 erika made her mother an omelet and the kitchen a mess . m_02 1 bill went to london on monday . m_02 1 bill went on monday to london . m_02 1 my brother lives near strasbourg . m_02 1 near strasbourg my brother lives . m_02 1 he planted the garden with roses last november . m_02 1 he planted the garden last november with roses . m_02 1 the baby chewed the biscuit . m_02 1 the baby is heavy . m_02 1 what the baby did was chew the biscuit . m_02 1 the baby was chewing the biscuit . m_02 1 chew the biscuit ! m_02 1 hartfield house is in surrey . m_02 1 mr knightley rode to kingston . m_02 1 eleanor and marianne travelled from shropshire . m_02 1 frank gave a piano to jane fairfax . m_02 1 jane fairfax received a piano from frank . m_02 1 the thief smashed the window with a hammer . m_02 1 captain wentworth recovered the property for mrs smith . m_02 1 the window was broken by a hammer . m_02 1 wren built st paul 's cathedral . m_02 1 siobhan burnt a pattern on the piece of wood . m_02 1 the dog dug a hole in the lawn . m_02 1 the vase stood on the table in the hall . m_02 1 imogen took the vase to her mother 's . m_02 1 imogen broke the vase . m_02 1 sue knows the answer . m_02 1 the answer is known to sue . m_02 1 jim was happily chopping logs . m_02 1 jim was chopping logs when margaret left and was still at it when she got back . m_02 1 jim was enthusiastically chopping logs . m_02 1 captain oates died in order to save his comrades . m_02 1 this arch supports the weight of the tower . m_02 1 what this arch does is support the weight of the tower . m_02 1 this arch is supporting the weight of the tower . m_02 1 the computer is playing six simultaneous games of three dimensional chess . m_02 1 the intense cold killed the climbers . m_02 1 the climbers were killed by the intense cold . 
m_02 1 the climbers were killed with the intense cold . m_02 1 catriona opened the door with this key . m_02 1 the visas are with the passports . m_02 1 sally went to the party with andrew . m_02 1 alan made the loaf with strong white flour . m_02 1 the builders made the wall with concrete blocks . m_02 1 the gardener planted roses in the garden . m_02 1 it was roses that the gardener planted in the garden . m_02 1 it is the garden that the gardener planted with roses . m_02 1 roses are certain to be planted in the garden by the gardener . m_02 1 the garden is certain to be planted with roses by the gardener . m_02 1 helen sent a scarf to jim for margaret . m_02 1 what happened was they went home . m_02 0 * what happened was they knew his parents . m_02 0 * we are knowing this theory . m_02 1 they 're believing everything you say . m_02 1 you 'll soon be owning all the land round here . m_02 1 what she did was e-mail all her friends . m_02 0 * what she did was know this theory . m_02 0 * what she did was be very cold . m_02 0 * what she did was own all the land round here . m_02 1 harriet talked to emma for hours . m_02 1 the dog chased the cat for days . m_02 1 harriet told emma the whole story . m_02 1 the dog caught the cat . m_02 1 the beaver built a dam . m_02 1 anne played the tune on the piano . m_02 1 jane was playing the piano . m_02 1 jane played the piano . m_02 1 tess was knocking at the door . m_02 1 tess knocked at the door . m_02 1 frank churchill was crossing the street . m_02 1 jane is visiting emma . m_02 1 jane visits emma . m_02 1 tess is knocking at the door . m_02 1 tess knocks at the door . m_02 1 frank churchill is crossing the street . m_02 1 frank churchill crosses the street . m_02 1 real play valencia next sunday . m_02 1 i leave for paris next week . m_02 0 * the volcano erupts on tuesday . m_02 1 the minister has arrived . m_02 1 i 've been at work for six hours . m_02 1 have you ever visited doubtful sound ? 
m_02 1 there was an attack yesterday . m_02 1 emma and harriet were attacked by those bandits . m_02 1 those bandits attacked emma and harriet yesterday . m_02 1 the vase was smashed deliberately . m_02 1 the sheep got infected with scrapie . m_02 1 the fans were deliberately provoked by a rival group . m_02 1 the fans got deliberately provoked by a rival group . m_02 1 six students got shot accidentally . m_02 1 some gifts get used a dozen or so times a year . m_02 1 ca n't you see i 'm reading ? m_02 1 people go hunting in the autumn . m_02 1 we spent yesterday cooking . m_02 1 she buys for harrods . m_02 1 i saw and he chops . m_02 1 this sweater washes well . m_02 1 this book reads well . m_02 1 these cars sold very quickly last week . m_02 1 it will take years for the mersey to clean . m_02 1 the course is jumping well . m_02 1 one bomb did n't guide and crashed . m_02 1 fiona may be here by 5 o'clock . m_02 1 if fiona is here by 5 o'clock , we can go to the party . m_02 1 it 's high time fiona got a job . m_02 0 * it 's high time fiona gets a job . sgww85 1 pat is either stupid or a liar . sgww85 1 pat is a republican and proud of it . sgww85 1 pat is healthy and of sound mind . sgww85 1 pat is either asleep or at the office . sgww85 1 that was a rude remark and in very bad taste . sgww85 1 sandy is either a lunatic or under the influence of drugs . sgww85 1 i am hoping to get an invitation and optimistic about my chances . sgww85 1 i am neither an authority on this subject nor trying to portray myself as one . sgww85 1 pat was neither recommended for promotion nor under any illusions about what that meant . sgww85 1 pat has become a banker and very conservative . sgww85 1 i consider that a rude remark and in very [ np and pp ] bad taste . sgww85 1 the scene of the movie was in chicago . sgww85 0 * the scene of the movie and that i wrote was in chicago . sgww85 1 john sang beautifully . sgww85 1 john sang a carol . 
sgww85 0 * john sang beautifully and a carol . sgww85 1 kim sang and sandy danced . sgww85 1 kim and sandy met . sgww85 1 kim sang and was accompanied by sandy . sgww85 0 * the irritating and a bully man was my brother . sgww85 0 * soon irritating and a bully started shouting again . sgww85 1 kim was a banker . sgww85 1 dana was quite competent . sgww85 1 leslie was in the flood zone . sgww85 1 ronnie was talking to lou . sgww85 1 jean was given a prize . sgww85 1 pat has become a republican . sgww85 1 gerry became quite conservative . sgww85 0 * connie has become of the opinion that we should get out . sgww85 0 * tracy became awarded a prize . sgww85 0 * chris will become talking to colleagues . sgww85 1 pat became a republican and quite conservative . sgww85 0 * tracy has become a republican and of the opinion that we must place nuclear weapons in europe . sgww85 0 * chris became quite conservative and trying to change their minds . sgww85 0 * gerry became a republican and awarded a prize . sgww85 1 we walked slowly and with great care . sgww85 1 they wanted to leave tomorrow or on tuesday . sgww85 1 we are open saturdays , any national holiday , and on alternate . sgww85 1 kim alienates cats and beats his dog . sgww85 1 kim alienates cats and beat his dog . sgww85 1 kim alienated cats and beats his dog . sgww85 1 kim alienated cats and beat his dog . sgww85 0 * kim alienated cats and beaten his dog . sgww85 0 * kim beating his dog and alienates cats . sgww85 0 * kim to beat his dog and alienated cats . sgww85 0 * kim beaten his dog and alienates cats . sgww85 1 which student 's grades went unreported ? sgww85 1 they found pictures of themselves . sgww85 0 * who did you say my talking to would bother hilary ? sgww85 1 who did you say my talking to would bother ? sgww85 1 which article did terry file without reading ? sgww85 1 which books did robin read and hate ? sgww85 0 * which books did robin talk to chris and read ? 
sgww85 0 * which books did robin read and talk to chris ? sgww85 0 * who did robin visit and ? sgww85 1 they talked to kim and to each other . sgww85 1 he hated himself and his friends . sgww85 1 they were wary of themselves and of each other . sgww85 1 they asked which students and which teachers would get along together . sgww85 1 we called up every man whose father and whose mother had played on the team . sgww85 1 i went to the store and bought some whiskey . sgww85 1 she 's gone and ruined her dress now . sgww85 1 i 've got to try and find that screw . sgww85 1 she goes and buys some whiskey . sgww85 1 i have gone and bought some whiskey . sgww85 1 i will go and buy some whiskey . sgww85 1 i will try and buy some whiskey . sgww85 0 * i have gone and buys some whiskey . sgww85 0 * to go and buying whiskey is not the solution to your problem . sgww85 0 * i will go and bought some whiskey . sgww85 0 * i tried and buy some whiskey . sgww85 0 * i was trying and buying some whiskey . sgww85 0 * what did you say i went and get ? sgww85 0 * what did you say i go and got ? sgww85 1 i went to the store and i bought some whiskey . sgww85 1 i 've got to try and i 've got to find that screw . sgww85 1 i both went to the store and bought some whiskey . sgww85 1 i 've got to both try and find that screw . sgww85 1 here 's the whiskey which i went to the store and bought . sgww85 1 which dress has she gone and ruined now ? sgww85 1 the screw which i 've got to try and find holds the door to the frame . sgww85 1 either we americans or i myself will get ourselves in trouble . sgww85 1 either you or i will incriminate ourselves . sgww85 1 you and i may incriminate ourselves . sgww85 1 we americans and the british pamper ourselves . sgww85 1 you british and you americans pamper yourselves . sgww85 1 you british or you americans will get yourselves in trouble . sgww85 1 you and kerry have outdone yourselves . sgww85 1 you or kerry have perjured yourselves . 
sgww85 1 the boys and the girls seem happy . sgww85 0 * the boys and the girls seems happy . sgww85 1 either the boys or the girls are going to be there . sgww85 1 the students and professor swansong are meeting in the park . sgww85 1 either professor swansong or the graduate students are going to proctor the exam . sgww85 1 either dana or lee is going to lead the parade . sgww85 1 kim and terry are happy . sgww85 0 * either the boys or the girls is going to be there . sgww85 0 * the students and professor swansong is meeting in the park . sgww85 0 * either professor swansong or the graduate students is going to proctor the exam . sgww85 1 either dana or lee are going to lead the parade . sgww85 1 kim likes sandy , and lee leslie . to try to go to rome . sgww85 1 pat wanted to try to go to berne , and chris to go to rome . to rome . sgww85 1 kim went to the store , and then lou . sgww85 1 some people go by car , but others by bike . sgww85 1 some people like bagels , but others cream cheese . sgww85 1 on weekdays , terry eats meat and vegetables , but on weekends , only vegetables . sgww85 0 * john drinks coffee at 11 , and mary , tea at 10:30 . sgww85 1 john gave the books to mary at christmas , and the records to sue for her birthday . sgww85 1 john talked to his supervisor about his thesis , and erich to the dean about department politics . sgww85 1 a businessman will drink a martini to relax , and a health nut , a glass of wine , just to remain healthy . sgww85 0 * john left at 11 and at 12 , bill . sgww85 1 john left his office at 11 and at 12 , the library . sgww85 1 a policeman walked in at 11 , and at 12 , a fireman . sgww85 1 two days ago , we went out to dinner , and this afternoon , to the movies . sgww85 1 on this table , they put a lamp , and on that table , a radio . sgww85 0 * john did n't see mary and bill sue . sgww85 1 john did n't give the books to mary and the papers to sue . sgww85 0 * kim likes sandy , and lee to leslie . 
sgww85 0 * pat wanted to go to berne , and chris going to rome . sgww85 0 * kim gave a dollar to bobbie and a dime into his pocket . sgww85 0 * kim likes lee , and to ronnie . sgww85 0 * kim likes sandy and lee likes to leslie . sgww85 1 leslie is rather foolish , and lou a complete idiot . sgww85 1 kim seems to be just surviving , and terry in dire need of our help . sgww85 1 we consider leslie rather foolish , and lou a complete idiot . sgww85 1 pat has become crazy , and chris an incredible bore . sgww85 0 * pat has become crazy , and chris in good spirits . sgww85 0 * i gave a book to john 's mother and a magazine to him . sgww85 1 pat remembered the appointment and that it was important to be on time . sgww85 1 that goldstein appointed heydrich and the implications thereof frightened many observers . sgww85 1 we talked about mr. colson and that he had worked at the . sgww85 1 you can depend on my assistant and that he will be on time . sgww85 1 pat was annoyed by the children 's noise and that their parents did nothing to stop it . sgww85 0 * we talked about that he had worked at the white house . sgww85 0 * you can depend on that he will be on time . sgww85 0 * pat was annoyed by that their parents did nothing to stop it . sgww85 1 we talked about the issues we had worked on as students and that our perspectives had changed over the years . sgww85 0 * we talked about that our perspectives had changed over the years and the issues we had worked on as students . sgww85 1 that our perspectives had changed over the years and the issues we had worked on as students were the topics of discussion . sks13 1 the clever snake disappeared into a hole in the ground . sks13 0 * hole into disappeared ground the the in clever a little . sks13 0 * the snake clever disappeared into a hole in the ground . sks13 0 * this girl in the red coat will put a picture of bill it on your desk before tomorrow . 
sks13 0 * this girl in the red coat will put a picture of bill on your desk there before tomorrow . sks13 0 * this girl in the red coat one will put a picture of bill on your desk before tomorrow . sks13 0 * this girl in the red coat will put a picture of bill on your desk it before tomorrow . sks13 1 this girl in the red coat will put a picture of bill on your desk before tomorrow . sks13 1 bill will put a picture of this girl in the red coat on your desk before tomorrow . sks13 1 she will put a picture of bill on your desk before tomorrow . sks13 0 * bill will put a picture of she on your desk before tomorrow . sks13 1 bill will put a picture of her on your desk before tomorrow . sks13 0 * her will put a picture of bill on your desk before tomorrow . sks13 0 * she her will put a picture of bill on your desk before tomorrow . sks13 0 * bill will put a picture of she her on your desk before tomorrow . sks13 1 clean your desk before tomorrow . sks13 1 this girl will put a picture of bill on your desk before tomorrow . sks13 1 this boy must not go to school , and his father must not go to school either . sks13 1 this boy must not go to france , but his father must go to france . sks13 1 this actress must play in this movie and she will play in this movie . sks13 1 can mary win the race and will sue win the race too ? sks13 1 this girl will buy bread and so will that one buy bread . sks13 1 the tourists will go to the park . sks13 1 will the tourists go to the park ? sks13 1 some student from australia speaks chinese . sks13 1 does some student from australia speak chinese ? sks13 1 they would have been walking for hours . sks13 1 would they have been walking for hours ? sks13 1 this girl will not buy bread , will she buy bread ? sks13 1 sean penn can act well in many kinds of movies , ca n't he act well in many kinds of movies ? sks13 1 you will put a picture of bill on your desk before tomorrow . 
sks13 1 this girl in the red coat or you will put a picture of bill on your desk before tomorrow .
sks13 1 no boys will put a picture of bill on your desk before tomorrow .
sks13 1 this girl in the red coat but no boys will put a picture of bill on your desk before tomorrow .
sks13 1 this girl in the red coat will put it and a picture of bill on your desk before tomorrow .
sks13 1 this girl in the red coat will put a picture of bill in the mailbox before tomorrow .
sks13 1 this girl in the red coat will put a picture of bill on your desk after the dinner .
sks13 1 this girl in the red coat will put a picture of bill on your desk after the dinner and before tomorrow .
sks13 1 this girl in the red coat will eat her breakfast before tomorrow .
sks13 1 this girl in the red coat will eat her breakfast before tomorrow and put a picture of bill on your desk before tomorrow .
sks13 1 this girl in the red coat will eat her breakfast and will put a picture of bill on your desk before tomorrow .
sks13 1 this girl in the red coat will put a picture of bill on your desk .
sks13 1 this girl in the red dress must put a picture of bill on your desk .
sks13 0 * this girl in the red coat will and dress must put a picture of bill on your desk .
sks13 0 * this girl in the or on the red coat will put a picture of bill on your desk .
sks13 1 john and mary will play with henry and with sue .
sks13 1 they play unusual music , and i listen to unusual music .
sks13 1 they play and i listen to unusual music .
sks13 1 i love ice milk tea but you hate ice milk tea .
sks13 1 i love but you hate ice milk tea .
sks13 1 she may have thawed the roast and should have thawed the roast .
sks13 1 she may have and should have thawed the roast .
sks13 1 smith loaned a valuable collection of manuscripts to the library , and his widow later donated a valuable collection of manuscripts to the library .
sks13 1 smith loaned and his widow later donated a valuable collection of manuscripts to the library .
sks13 1 i borrowed large sums of money from the bank , and my sister stole large sums of money from the bank .
sks13 1 i borrowed and my sister stole large sums of money from the bank .
sks13 0 * put a picture of bill on your desk , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 1 mary should know that you must go to the station .
sks13 1 that you must go to the station , mary should know that you must go to the station .
sks13 0 * this your , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * will bill , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * red picture desk , this girl in the red coat will put a picture of .
sks13 0 * before your , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * girl in the red coat , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * will put a picture of bill on your desk before tomorrow , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * the red , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * of bill on , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * will put , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 0 * your desk before , this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 1 it is your notes that john wants to look at after class .
sks13 1 it is after class that john wants to look at your notes .
sks13 1 it is john who wants to look at your notes after class .
sks13 1 it was ann who bought a first edition of richard iii for $ 1000 .
sks13 1 it was a first edition of richard iii that ann bought for $ 1000 .
sks13 1 it was for $ 1000 that ann bought a first edition of richard iii .
sks13 0 * it is before tomorrow that this girl in the red coat will put a picture of bill on your desk before tomorrow .
sks13 1 mary saw the tall man coming from england .
sks13 0 * it is the tall man coming from england that mary saw the tall man coming from england .
sks13 1 mary saw the tall man come from the back .
sks13 0 * it is the tall man come from the back that mary saw the tall man come from the back .
sks13 1 it is a picture of bill that this girl in the red coat will put on your desk before tomorrow .
sks13 0 * it is put a picture of bill on your desk before tomorrow that this girl in the red coat will .
sks13 1 what john wants to look at now is your notes .
sks13 0 * what mary gave was a book to john .
sks13 0 * what mary donated was a lot of money to npr .
sks13 1 it is to cleveland that john drove the truck .
sks13 1 what john became was deadly afraid of flying .
sks13 0 * it is deadly afraid of flying that john became .
sks13 1 john told us that he wants to quit school .
sks13 0 * it is that he wants to quit school that john told us .
sks13 1 what john told us is that he wants to quit school .
sks13 1 john promised us to be gentle .
sks13 0 * it is to be gentle that john promised .
sks13 1 mary will arrive tomorrow .
sks13 0 * it is arrive tomorrow that mary will .
sks13 1 henri wants the book which is on the top shelf .
sks13 1 what henri wants is the book which is on the top shelf .
sks13 1 the spy became too friendly with his new contacts .
sks13 1 what the spy became was too friendly with his new contacts .
sks13 1 what this girl in the red coat will do is put a picture of bill on your desk before tomorrow .
sks13 1 henri wants to buy these books about cooking .
sks13 1 which books about cooking does henri want to buy ?
sks13 1 i sent it to you .
sks13 0 * i sent to you it .
sks13 0 * i sent to you recipes .
sks13 1 bill 's mother 's friends are waiting at the restaurant .
sks13 1 bill 's mother 's friends and john are waiting at the restaurant .
sks13 1 it was john that was waiting at the restaurant .
sks13 0 * it was john bill that were waiting at the restaurant .
sks13 1 it was john and bill that were waiting at the restaurant .
sks13 1 i will eat spaghetti on sunday with marco .
sks13 1 i will speak to hector about this .
sks13 1 i doubt that mary reads mysteries .
sks13 1 he muttered that the visitors will leave .
sks13 1 the fact that john is snoring is informative .
sks13 1 the man that mary saw knew me .
sks13 1 that the visiting team won the race could surprise them .
sks13 1 that is what you should see .
sks13 1 john knows that she left .
sks13 1 john knows whether she will come back .
sks13 1 john knows that she left and whether she will come back .
sks13 1 john knows that she left and john knows whether she will come back .
sks13 1 john asked whether she left .
sks13 1 i doubt if she kicks perfect goals every time .
sks13 1 they think that she can do it .
sks13 1 whether she left is most unclear .
sks13 1 that the girl put a picture there proves her guilt .
sks13 1 i prefer for the girl to put a picture there .
sks13 1 for the girl to put a picture there is what i prefer .
sks13 1 for the girl to put a picture there would surprise you .
sks13 1 i prefer for the girl to win .
sks13 0 * i prefer for the girl to will win .
sks13 0 * i prefer for the girl to wins .
sks13 1 let 's walk .
sks13 1 i run on the beach .
sks13 1 the three sunbathers went swimming .
sks13 1 i hope that mary wins .
sks13 1 they know if mary won .
sks13 1 i wonder whether mary will win .
sks13 1 they prefer for mary to leave .
sks13 1 john wonders whether mary will win .
sks13 1 john wonders whether to win .
sks13 1 whether she will win is a question mary never considered .
sks13 1 whether to win is a question mary never considered .
sks13 1 i think that you will see that the girl will put a picture on your desk .
sks13 1 they understand that you will prefer for the girl to put a picture on your desk .
sks13 1 mary cuts the paper easily .
sks13 1 the paper cuts easily .
sks13 1 that he won the race could surprise them .
sks13 0 * that him won the race could surprise them .
sks13 1 for him to win the race would surprise them .
sks13 0 * for he to win the race would surprise them .
sks13 1 john saw mary .
sks13 1 harry likes movies .
sks13 1 for mary to leave on time is important .
sks13 0 * think about linguistics all night , she does think about linguistics all night .
sks13 0 * climb to the top , they do climb to the top .
sks13 1 john can go to the market on his bike .
sks13 1 mary should buy some flowers on sunday .
sks13 1 my niece could write me letters before her third birthday .
sks13 1 my nephew could write letters to his parents with a fountain pen .
sks13 1 john can go to the market quickly .
sks13 1 mary should buy some flowers for her mother to arrange .
sks13 1 my niece could write me letters more faithfully .
sks13 1 john can quickly go to the market .
sks13 1 my niece could more faithfully write me letters .
sks13 0 * john can go to the market to india .
sks13 0 * mary should buy some flowers some bread .
sks13 0 * my niece could write me you letters .
sks13 0 * my nephew could write letters the postcards to his parents .
sks13 1 john can go to the market on his bike on a truck .
sks13 1 mary should buy some flowers on sunday at 5 o'clock .
sks13 1 my nephew could write letters to his parents with a fountain pen with your help .
sks13 1 pelé visited his uncle .
sks13 1 she sold the car to sam for five dollars .
sks13 1 she ran the car on propane from reno to vegas .
sks13 1 the process changed the substance from solid to liquid to gas to energy .
sks13 1 we associated their subsidiaries with our corporate office .
sks13 1 i cycled around france .
sks13 1 mary drank some beer in the barn from 6 to nine .
sks13 1 it was in the barn or it took place in the barn .
sks13 0 * it was some beer or it took place some beer .
sks13 1 they wonder whether mary will run .
sks13 1 they wonder about this .
sks13 1 they wonder .
sks13 1 i know that she runs .
sks13 1 i know .
sks13 1 i said that she runs .
sks13 1 i said that .
sks13 0 * i said .
sks13 1 i prefer for mary to run .
sks13 1 i prefer this .
sks13 0 * i prefer .
sks13 1 i said for mary to run .
sks13 1 i said this .
sks13 1 i put the book on the shelf .
sks13 0 * i put the book .
sks13 0 * i put .
sks13 1 two ships appeared , arrived , remained , emerged .
sks13 1 suddenly , there appeared two ships on the horizon .
sks13 1 two inspectors from the ins appeared , arrived , remained , emerged .
sks13 1 the ice melts , breaks .
sks13 1 the door opens , closes .
sks13 1 they melted , broke the ice .
sks13 1 they opened , closed the door .
sks13 1 they cooked , thickened the soup .
sks13 1 i go , run , swim , jump , fly , crawl , dance , walk .
sks13 0 * they went me , ran me , swam me , jumped me , flew me , crawled me , danced me , walked me .
sks13 1 i danced a dance .
sks13 1 he walked the walk .
sks13 1 the time elapsed slowly .
sks13 0 * the time elapsed the day .
sks13 1 i see stars .
sks13 1 i see .
sks13 1 i liked mary .
sks13 0 * i liked .
sks13 1 they surrounded the fort .
sks13 0 * they surrounded .
sks13 1 i gave the charity .
sks13 1 i gave money .
sks13 1 i gave .
sks13 1 i handed the ball to reg .
sks13 0 * i handed the ball .
sks13 0 * i handed to reg .
sks13 0 * i handed .
sks13 1 john ate .
sks13 1 john knows .
sks13 0 * john needed .
sks13 0 * john criticized .
sks13 1 john saw .
sks13 1 john told .
sks13 1 the agency classified the documents .
sks13 0 * the agency classified .
sks13 1 the war intensified the poverty .
sks13 1 this project is manageable .
sks13 1 it mattered on sunday .
sks13 1 i saw john on sunday .
sks13 1 i put the book on the desk on sunday .
sks13 1 i saw john with a telescope .
sks13 0 * it mattered with a telescope .
sks13 1 i covered the bread with butter .
sks13 0 * i emptied it with butter .
sks13 1 mary will complete her exam within an hour .
sks13 0 * mary will complete her exam for an hour .
sks13 1 the hiker will reach the top of the mountain within an hour .
sks13 0 * the hiker will reach the top of the mountain for an hour .
sks13 1 henri will paint the floor for an hour .
sks13 1 i will read linguistics for an hour .
sks13 1 the student left .
sks13 1 only the student left .
sks13 1 even the student left .
sks13 1 all the students left .
sks13 1 i saw the student .
sks13 1 i saw only the student .
sks13 1 i saw all the students .
sks13 1 john , who i saw yesterday , will visit us .
sks13 1 i saw the brilliant student .
sks13 1 i saw the brilliant one .
sks13 1 i saw the brilliant student with long hair .
sks13 1 i saw the brilliant one with long hair .
sks13 1 i saw the one with long hair .
sks13 1 i saw the physics student .
sks13 0 * i saw the physics one .
sks13 1 i saw the student of physics .
sks13 0 * i saw the one of physics .
sks13 1 i saw the student of physics with long hair .
sks13 1 the big student of physics with long hair in the library .
sks13 1 it is big .
sks13 1 it is with long hair .
sks13 0 * it is of physics .
sks13 1 it is in the library .
sks13 1 they are intense .
sks13 0 * they are intense of bill .
sks13 1 they intensified .
sks13 1 they are special .
sks13 0 * they are special of bill .
sks13 1 they specialized .
sks13 1 she is proud .
sks13 1 she is the mother .
sks13 1 she is the mother of john .
sks13 1 they read the paper .
sks13 1 the paper is readable .
sks13 0 * it is readable of the paper .
sks13 0 * they are readable of the paper .
sks13 1 the driver of the car thinks that mary should leave dallas for boise tomorrow .
sks13 1 her little sister will disagree with her .
sks13 1 the girl he met at the departmental party will very surely call him .
sks13 1 beavers build dams .
sks13 1 john will see you .
sks13 1 john thinks that mary left .
sks13 1 john thinks mary left .
sks13 1 john whispered that mary left .
sks13 1 john will carefully study russian .
sks13 1 john carefully studies russian .
sks13 0 * john studies carefully russian .
sks13 1 i wonder if she will use paints .
sks13 1 yes , she will .
sks13 0 * yes , she .
sks13 0 * yes , she will use .
sks13 1 i wonder if she used paints .
sks13 1 yes , she did .
sks13 0 * yes , she used .
sks13 1 john will have been eating cake .
sks13 0 * mary wo n't have been eating cake , but john .
sks13 1 mary wo n't have been eating cake , but john will .
sks13 1 mary wo n't have been eating cake , but john will have .
sks13 1 mary wo n't have been eating cake , but john will have been .
sks13 1 john will enthusiastically have been eating cake .
sks13 1 john will have enthusiastically been eating cake .
sks13 0 * john will have been eating enthusiastically cake .
sks13 1 john will have been eating cake enthusiastically .
sks13 0 * john studied carefully russian .
sks13 1 john has carefully studied russian .
sks13 1 john had carefully studied russian .
sks13 1 john is carefully studying russian .
sks13 1 john was carefully studying russian .
sks13 1 john goes to school .
sks13 0 * goes john to school ?
sks13 1 mary thinks that bill will come .
sks13 0 * mary thinks whether bill will come .
sks13 0 * mary thinks for bill to come .
sks13 1 mary wonders whether bill will come .
sks13 0 * mary wonders for bill to come .
sks13 0 * mary prefers that bill will come .
sks13 0 * mary prefers whether bill will come .
sks13 1 mary prefers for bill to come .
sks13 1 i wonder has mary worked for microsoft .
sks13 1 i wonder whether mary has worked for microsoft .
sks13 0 * i wonder whether has mary worked for microsoft .
sks13 0 * i wonder has whether mary worked for microsoft .
sks13 1 will john not go to school ?
sks13 1 has henri not studied for his exam ?
sks13 1 did sue not pass her exam ?
sks13 1 wo n't john go to school ?
sks13 1 should n't mary taste the soup ?
sks13 1 has n't henri studied for his exam ?
sks13 1 is n't bill sick ?
sks13 1 did n't sue pass her exam ?
sks13 0 * will not john go to school ?
sks13 0 * should not mary taste the soup ?
sks13 0 * has not henri studied for his exam ?
sks13 0 * is not bill sick ?
sks13 0 * did not sue pass her exam ?
sks13 0 * sue put .
sks13 0 * henri arrived bill .
sks13 0 * mary wonders that john said if bill left .
sks13 0 * henri told sue in the drawer that bill put socks .
sks13 1 she will win the race .
sks13 0 * her will the race .
sks13 1 elmer finished the cake and john did too , finish the cake .
sks13 1 we need to provide two trees and .
sks13 1 we also need to explain the relation between these trees .
sks13 0 * john not liked mary .
sks13 0 * john liked not mary .
sks13 1 john did not like mary .
sks13 1 john will endorse the treaty , but georges will not endorse the treaty .
sks13 1 will george indeed not endorse the treaty ?
sks13 0 * he will indeed not endorse the treaty .
sks13 1 he will indeed endorse the treaty .
sks13 1 he will not endorse the treaty ; and indeed .
sks13 1 john thinks that bill left .
sks13 1 john asked whether bill left .
sks13 1 john was wondering whether to leave or not .
sks13 1 john was wondering whether to leave .
sks13 0 * i read these big three books .
sks13 0 * mary sent .
sks13 1 mary sent a book to bill .
sks13 1 mary send a book .
sks13 1 mary sent bill a book , … .
sks13 0 * bill examined a book .
sks13 0 * sincerity examined a book .
sks13 0 * we put .
sks13 1 we put a book on the table .
sks13 1 we think that bill left .
sks13 0 * we think for bill left .
sks13 0 * we think if bill left .
sks13 1 we wonder whether bill left .
sks13 1 we wonder if bill left .
sks13 0 * we wonder that bill left .
sks13 1 john came in .
sks13 1 then , john left .
sks13 1 he took his umbrella .
sks13 1 he hurt himself with it when he tried to open it .
sks13 1 the idiot ca n't even open an umbrella !
sks13 0 * john hurt john with john 's umbrella when john tried to open it .
sks13 1 john ca n't even open an umbrella !
sks13 1 john said he was sick .
sks13 1 the ta who graded him says that john did really well .
sks13 0 * himself should decide soon .
sks13 0 * mary wrote a letter to himself last year .
sks13 1 he should decide soon .
sks13 1 mary wrote a letter to him last year .
sks13 1 our rabbit and the neighbor 's cat like each other .
sks13 1 the boys fought with each other .
sks13 1 each of our rabbit and the neighbor 's cat likes the other .
sks13 1 each of the boys fought with the other boys .
sks13 1 the boy likes himself .
sks13 0 * the boy likes herself .
sks13 0 * the boy likes themselves .
sks13 1 the girls likes themselves .
sks13 0 * the girls likes herself .
sks13 1 each of the girls likes herself .
sks13 0 * the girls likes yourselves .
sks13 0 * himself likes john .
sks13 1 mary 's pictures of herself surprised bill .
sks13 1 i noticed john 's excessive appreciation of himself .
sks13 1 mary noticed john 's excessive appreciation of himself .
sks13 0 * mary noticed john 's excessive appreciation of herself .
sks13 0 * mary noticed that john excessively appreciates herself .
sks13 1 john loved the new pictures of himself .
sks13 1 i showed mary several portraits of herself .
sks13 0 * john believes that mary saw himself .
sks13 1 mary noticed that john excessively appreciates himself .
sks13 1 mary appreciates only john and herself .
sks13 0 * mary appreciates john and himself .
sks13 1 mary really appreciates and constantly praises herself and sue knows it .
sks13 0 * mary really appreciates and constantly praises himself and bill knows it .
sks13 1 john heard their criticism of each other .
sks13 1 john heard their criticism of themselves .
sks13 0 * they heard john 's criticism of each other .
sks13 0 * they heard john 's criticism of themselves .
sks13 0 * john heard that they criticized each other .
sks13 0 * they heard that john criticized each other .
sks13 1 john likes himself .
sks13 1 the students are proud of themselves .
sks13 1 everyone likes himself .
sks13 1 no spy betrayed himself .
sks13 1 i heard john 's criticism of himself .
sks13 0 * i heard john 's criticism of myself .
sks13 1 john heard that i criticized myself .
sks13 0 * i heard that john criticized myself .
sks13 1 mary likes herself .
sks13 0 * our rabbit and the neighbor 's cat like them .
sks13 0 * bill likes herself .
sks13 0 * himself laughs .
sks13 1 the girls likes them .
sks13 1 john 's mother likes him .
sks13 1 john believes that bill saw himself .
sks13 1 john believes that bill saw him .
sks13 0 * mary believes that bill saw herself .
sks13 1 they like their books .
sks13 1 everyone thinks he is smart .
sks13 1 who in this class thinks he is smart ?
sks13 1 bill 's mother saw him .
sks13 0 * no one 's mother saw himself .
sks13 1 the mayor of john 's hometown wrote to him .
sks13 1 the builder of his house visited peter .
sks13 1 that is a bird .
sks13 1 that 's the truth .
sks13 1 he is john .
sks13 1 bob dylan is robert zimmerman .
sks13 1 i like mary and she likes me .
sks13 0 * i like mary and she does too .
sks13 0 * i like mary and she does like mary too .
sks13 1 she considers john proud of his work .
sks13 1 they saw bill leave .
sks13 1 mary prefers that her ice cream is in a cone .
sks13 1 henry saw that bill left .
sks13 1 what mary prefers is her ice cream in a cone .
sks13 0 * what she considers is john proud of his work .
sks13 0 * what henry found is bill sad .
sks13 0 * what they saw is bill leave .
sks13 0 * what henry find was bill sad .
sks13 0 * john heard mary describe himself .
sks13 1 john heard mary describe herself .
sks13 0 * mary considers john proud of herself .
sks13 1 mary considers john proud of her .
sks13 1 mary considers john proud of himself .
sks13 1 john believes himself to be proud of mary .
sks13 1 the pictures of bill she put on your desk .
sks13 1 which pictures of bill did she put on your desk .
sks13 1 susan wanted to sleep .
sks13 1 she put the pictures of bill on your desk .
sks13 1 the pictures of bill , she put on your desk .
sks13 0 * the picture of bill she slept .
sks13 0 * she slept the picture of bill .
sks13 1 you put which picture of bill on his desk ?
sks13 1 which picture of bill did you put on his desk ?
sks13 1 how many strings did you say she had to pull in order to do that ?
sks13 1 how much care do you think he would be taking of his patients under those circumstances ?
sks13 1 how much headway is he likely to make .
sks13 1 who left bill .
sks13 0 * whom left bill .
sks13 1 who did bill leave .
sks13 1 whom did bill leave .
sks13 1 is there anything to do today ?
sks13 1 there are two main characters in the novel .
sks13 1 there are 3 firemen available .
sks13 0 * there stabbed an animal .
sks13 0 * there ran many people .
sks13 0 * mary judged there .
sks13 0 * i had a realization of there .
sks13 1 there were seven people .
sks13 1 there were several doctors available .
sks13 1 rodney was eating some squid , was n't he ?
sks13 1 there is a man ready to jump from the roof , is n't there ?
sks13 1 sharks seem to swim slowly in the tropics .
sks13 1 the cat seems to be out of the bag .
sks13 1 the shit seems to have hit the fan .
sks13 0 * there run many people .
sks13 1 there seems to be a nurse available .
sks13 0 * there seems to stab an animal .
sks13 0 * there seems to run many people to the station .
sks13 1 it seems that john left .
sks13 1 several people seem sick .
sks13 1 john considers several people sick .
sks13 1 there are several people sick .
sks13 1 several people seem several people sick .
sks13 1 several people are sick .
sks13 1 bill is sick .
sks13 1 susan hopes to sleep .
sks13 1 susan hopes that she will sleep .
sks13 0 * susan hopes susan to sleep .
sks13 0 * everyone hopes him to sleep .
sks13 1 everyone hopes to sleep .
sks13 1 everyone hopes that everyone will sleep .
sks13 0 * susan hopes her to sleep .
sks13 1 only churchill remembered giving the blood , sweat and tears speech .
sks13 1 only churchill remembered his giving the blood , sweat and tears speech .
sks13 1 only churchill remembered himself giving the blood , sweat and .
sks13 1 susan hopes herself to sleep .
sks13 1 for john to hurt his friends is stupid .
sks13 1 to hurt his friends is stupid .
sks13 1 for john to hurt himself is stupid .
sks13 1 to hurt oneself is stupid .
sks13 0 * for john to hurt oneself is stupid .
sks13 1 john promised bill to leave .
sks13 1 john promised mary that he would leave .
sks13 1 john promised mary to cut the grass .
sks13 1 john promise mary to control himself .
sks13 0 * john promised mary to control herself .
sks13 0 * john promised mary to shave herself .
sks13 1 john seems to sleep all day .
sks13 1 john hopes to sleep .
sks13 1 john tried to sleep .
sks13 0 * john believes to have slept .
sks13 1 john believes bill to have slept .
sks13 0 * john believes for bill to have slept .
sks13 1 john believes that bill has slept .
sks13 0 * john believes bill that mary has slept .
sks13 0 * john convinced to sleep .
sks13 1 john convinced bill to sleep .
sks13 0 * john convinced bill for mary to sleep .
sks13 0 * john convinced that bill has slept .
sks13 0 * it convinced bill that mary should sleep .
sks13 1 john believes it to be obvious that bill left .
sks13 1 john believes it to be raining .
sks13 0 * john convinced it to be obvious that bill left .
sks13 0 * john convinced it to be raining .
sks13 0 * john convinced there to be several firemen available .
sks13 1 bill cooked the rice .
sks13 1 the rice was cooked by bill .
sks13 1 bill visited mary .
sks13 1 mary was visited by bill .
sks13 1 john believes bill to have cooked the rice .
sks13 1 john believes the rice to have been cooked by bill .
sks13 1 john believes bill to have visited mary .
sks13 1 john believes mary to have been visited by bill .
sks13 1 john convinced bill to cook the rice .
sks13 0 * john convinced the rice to be cooked by bill .
sks13 1 john convinced bill to visit mary .
sks13 1 john believes that bill slept .
sks13 1 i sent money .
sks13 1 i sent mary money .
sks13 1 i sent money to mary .
sks13 0 * i sent bill money to mary to sam .
sks13 1 i worked on sunday in the city on that project without a break .
sks13 1 i praised mary .
sks13 0 * i praised .
sks13 1 the moon glows in the darkness .
sks13 1 the moon glows .
sks13 1 i sang a song with mary while you did so with bill .
sks13 1 what mary did with bill was sing a song .
ad03 1 she tried to leave
ad03 1 who said he would give the cloak to lee ?
ad03 0 * gilgamesh does n't be in the dungeon
ad03 1 which book about herself did jenny say that anson had written .
ad03 1 paul had eighty eight billion sixty three million forty-four thousand nine hundred at
ad03 0 * what i said that was we would go .
ad03 1 the boy thought she was happy .
ad03 1 the landlord donated a helicopter
ad03 1 most dragons have been neutered .
ad03 1 who did you meet all when you were in derry ?
ad03 1 jason persuaded medea to desert her family .
ad03 1 michael abandoned an old friend at mardi gras
ad03 1 you friends of the king are all the same
ad03 1 he is that kind of actor
ad03 0 * lucy 's gomez 's wallet
ad03 1 medea tended to appear to be evil .
ad03 1 he 's bound to could do it
ad03 1 nathan received the cloak from benjamin
ad03 1 that the world is round is obvious .
ad03 1 poseidon wept , after the executioner left .
ad03 1 i asked who did medea poison .
ad03 1 i never liked his analysis .
ad03 0 * peter is some happy pigs which can fly .
ad03 0 * gilgamesh not left .
ad03 0 * there arrived by medea .
ad03 1 i might have eaten some seaweed .
ad03 1 there appears to be a problem with this solution .
ad03 1 what julie became was fond of lloyd .
ad03 1 bill did not defeat the gods but gilgamesh did .
ad03 1 aphrodite frees animals
ad03 0 * the hospital was donated the book to .
ad03 1 medea , jason poisoned .
ad03 0 * they kicked himself
ad03 1 emily showed benjamin himself in the mirror .
ad03 1 jason was killed by medea .
ad03 1 how did you eat the cake ?
ad03 1 i asked who medea poisoned .
ad03 1 aphrodite wanted to live and ishtar tried to
ad03 1 i was sitting not under the tree but under the bush
ad03 1 the child wails
ad03 1 gilgamesh has n't left
ad03 0 * whiskey do i drink .
ad03 1 dracula thought that he was the prince of darkness .
ad03 1 he looked up the number .
ad03 1 she has kissed her .
ad03 1 agamemnon stopped jason casting the spell
ad03 1 humans love to eat some disgruntled old pigs in those ditches .
ad03 1 jason whispered that the phoenix had escaped
ad03 1 ron definitely has bought a dog .
ad03 0 * he book
ad03 1 who is it obvious that plato loves .
ad03 0 * which god the statue ?
ad03 0 * kiss pigs is my happiest memory
ad03 0 * dante accused
ad03 1 that picture of jenny in a rubber dress does n't flatter her .
ad03 1 he might could go
ad03 1 benjamin gave lee the cloak and nathan the chalice .
ad03 1 that monkey is ate the banana
ad03 1 i bought a book about harry
ad03 0 * the children wails
ad03 1 who was it obvious that plato loved ?
ad03 1 it was for jenny that i intended to be present .
ad03 1 i think she is pregnant
ad03 1 it 's extremely windy today .
ad03 0 * who did you believe that to kiss seemed wrong ?
ad03 1 jason would prefer medea to have cursed agamemnon .
ad03 0 * the therapist 's analysis of lucy 's
ad03 1 who did athena introduce to whom ?
ad03 1 it appears that poseidon owns a dragon
ad03 1 i have often eaten muffins .
ad03 1 gilgamesh can seek ishtar
ad03 1 you kicked yourself
ad03 1 agamemnon seems to have left .
ad03 1 the dragons had all eaten the pigs .
ad03 1 anson shot the dinosaur with his rifle in the jungle
ad03 0 * genie intoned the mirror .
ad03 1 i often have eaten muffins .
ad03 1 he kicked himself
ad03 1 gilgamesh has not read the cuneiform tablets .
ad03 1 he might maybe do that , might n't he ?
ad03 1 i intended for jenny to be present .
ad03 0 * we believed to be omnipotent .
ad03 1 whose poem about achilles did homer persuade jason that he should read ?
ad03 1 jason would prefer for medea to have cursed agamemnon .
ad03 1 i asked who medea gave what ?
ad03 1 i have every hope that you will defeat him .
ad03 1 paris is no more
ad03 1 he will can do it
ad03 1 we believed him to be the headmaster
ad03 1 who kissed who ?
ad03 1 who did you say that john thought would leave early ?
ad03 0 * any boy saw no one .
ad03 0 * what i arranged for jenny was to be present .
ad03 0 * he kicked herself
ad03 1 cassandra has warned agamemnon again .
ad03 1 gilgamesh has been fighting the dragon .
ad03 1 lucy 's photograph of jane
ad03 1 who did jason think medea had poisoned ?
ad03 1 gilgamesh may have quickly cast the spell
ad03 0 * having read of shakespeare satisfied me
ad03 0 * medea tried her to leave .
ad03 1 the potion boiled over
ad03 1 there arrived a new actor .
ad03 1 i ate fruit
ad03 1 i hoped that you would defeat him .
ad03 1 who seems to be certain to leave first ?
ad03 0 * she liked moya 's football .
ad03 1 his hen loves anson .
ad03 1 i ate a mango and gillian did too .
ad03 1 why did you eat the cake ?
ad03 0 * he would can go
ad03 1 perhaps gilgamesh should be leaving
ad03 1 i want to can do it
ad03 1 jason intended for him to learn magic .
ad03 1 i went to the shop for to get bread .
ad03 1 i asked which king invaded which city .
ad03 1 we made the claim that perseus killed the gorgon .
ad03 1 plato listened to dp demosthenes ' oration about philip .
ad03 1 the old house collapsed .
ad03 0 * i believed she is pregnant
ad03 1 how are you feeling ?
ad03 1 aphrodite misses gilgamesh .
ad03 1 anson very happily demonized david .
ad03 1 that plato loved aster proved to be his undoing .
ad03 1 has n't the potion worked ?
ad03 1 bill 's reading shakespeare satisfied me
ad03 1 every vampire slept .
ad03 1 i might be leaving soon .
ad03 0 * it 's arrived first that julie and jenny
ad03 1 the man i saw left .
ad03 0 * he replied his answer .
ad03 1 because they hated him , the druids forced jason to live in a cupboard
ad03 1 we kicked ourselves
ad03 1 did medea poison jason ?
ad03 1 aphrodite freed animals
ad03 1 the book was donated to the hospital .
ad03 1 medea poisoned more children than jason did .
ad03 1 nathan showed benjamin himself in the mirror .
ad03 1 that plato loved aster deeply was obvious .
ad03 1 he kicked him
ad03 1 jason expected the doctor to treat medea
ad03 1 the therapist 's analysis of lucy
ad03 1 where are you living ?
ad03 1 who showed what to who ?
ad03 1 medea thought that , after the executioner had left , poseidon would be relieved .
ad03 1 the consul 's gift of the gladiator to himself .
ad03 1 all the dragons have been slain .
ad03 1 gilgamesh should slowly be tickling the mandrake .
ad03 1 odysseus planned to hear the sirens .
ad03 1 bill reading shakespeare and maureen singing schubert satisfies me
ad03 1 the shooting of the hunters was very loud .
ad03 0 * the librarians likes books .
ad03 0 * can he will do it ?
ad03 0 * i ordered there to be three books on the subject .
ad03 1 truman punched johnson
ad03 1 he became fond of peanuts .
ad03 1 the therapist analysed lucy
ad03 1 dracula thought himself to be the prince of darkness .
ad03 0 * which poem did you hear those recitals of last night ?
ad03 1 athena introduced medea to jason
ad03 1 he 'll no can do it , will he ?
ad03 1 anson is incredibly difficult to please .
ad03 1 it was claimed by everyone that the poison was neutralised
ad03 1 the banana is being eaten by that monkey .
ad03 1 i want to kiss pigs
ad03 1 burn letters to him !
ad03 1 his analysis of her was flawed
ad03 1 did the potion boil over ?
ad03 1 i did n't see him ever .
ad03 0 * she said moya liked football .
ad03 1 we all thought him to be unhappy
ad03 1 which book are you reading ?
ad03 1 that monkey is eating the banana .
ad03 1 that bottle of water might have cracked open .
ad03 1 who did gilgamesh believe to have kissed aphrodite ?
ad03 1 paul had three affairs . . .
ad03 1 close the door !
ad03 1 i was eating not a peach but an apple
ad03 1 which poem did you go to hear a recital of last night ?
ad03 0 * when time will you be there .
ad03 1 i have sent 0 letters to environmental heath .
ad03 1 why did you kill pegasus ?
ad03 1 aphrodite does free animals
ad03 1 gilgamesh will seek ishtar
ad03 1 i assumed him to be innocent
ad03 1 i am being whipped
ad03 1 never will i do syntax again .
ad03 1 the children wail
ad03 1 mary fell .
ad03 1 i inquired if we could leave early .
ad03 1 benjamin gave the cloak and sent the book to lee
ad03 1 hera tried to appear to be happy .
ad03 0 * i arranged for to see her .
ad03 0 * bill 's reading shakespeare and maureen 's singing schubert satisfies me
ad03 0 * myself shaved me .
ad03 0 * no reading shakespeare satisfied me
ad03 1 the emperor 's every wish was immediately carried out .
ad03 1 jenny has eaten a cake .
ad03 0 * moya played football with her
ad03 0 * i intoned fruit
ad03 1 the sheep cry
ad03 1 he ca n't possibly do that , can he
ad03 1 we believed that aphrodite was omnipotent .
ad03 1 which book about ulysses did you say that you would read ?
ad03 0 * i wanted any cake .
ad03 1 gilgamesh is not reading the cuneiform tablets .
ad03 1 jason persuaded medea to try to run away .
ad03 1 we believed aphrodite to be omnipotent
ad03 1 that bottle of water might have .
ad03 1 i do n't remember what all i said ?
ad03 0 * aphrodite said he freed the animals and freed the animals he
ad03 1 that aphrodite was so promiscuous astounded the other gods .
ad03 0 * gilgamesh does n't ate the honey
ad03 1 i claimed that she was pregnant
ad03 0 * aphrodite do freed animals .
ad03 1 david wrote that you said that anson thought that julie had fainted
ad03 0 * gilgamesh failed often biology
ad03 1 it rained
ad03 1 poseidon was asleep , when the executioner arrived .
ad03 1 people are in the garden
ad03 1 anson became happy
ad03 1 it is tough to teach syntax .
ad03 1 there 's going to be a party , is n't there ?
ad03 0 * i have might be flying helicopters .
ad03 1 they brought the hat to the teacher
ad03 1 what medea attempted was to poison her children .
ad03 1 benjamin gave the cloak 0 and sent the book to lee
ad03 1 the man chuckles
ad03 1 milena will make pasta .
ad03 1 aphrodite did free animals
ad03 1 gilgamesh should seek ishtar
ad03 1 they depend on mary .
ad03 0 * the greeks arrived all .
ad03 0 * has not the potion worked
ad03 0 * gomez 's photograph of pugsley of lucy 's
ad03 0 * will can he do it ?
ad03 1 humans love to eat the old pigs .
ad03 0 * he could might go
ad03 1 it 's under the bed that 's the best place to hide
ad03 1 he left .
ad03 1 that picture of her pleases jenny .
ad03 1 constantly reading shakespeare satisfied me
ad03 1 there was a dragon in the cave .
ad03 1 people like lard .
ad03 1 these ones are to be smuggled from hungary .
ad03 0 * emily showed himself to benjamin in the mirror .
ad03 0 * i said that that jason was jealous annoyed medea
ad03 1 we donated a chopper to the new hospital
ad03 1 these expensive and illegal bottles of absinthe are to be smuggled from hungary .
ad03 1 a programme about euripides is on a radio 4 tonight .
ad03 1 lucy 's analysis was the most successful
ad03 0 * i am having eaten seaweed .
ad03 1 medea tried to poison her children .
ad03 0 * anson demonized old
ad03 1 what i said was that we would go .
ad03 1 there are many fish in the sea .
ad03 1 jason gave the poisoned clothes to who ?
ad03 0 * by is eaten monkey banana that the being
ad03 1 benjamin said he would run away and he did .
ad03 1 who is sailing to ithaca ?
ad03 1 they sat on mary .
ad03 1 julie filed letters to herself .
ad03 1 he looked it up
ad03 1 who 's there ?
ad03 0 * there was he in the garden .
ad03 1 he might no could have done it
ad03 0 * gilgamesh did n't ate the honey
ad03 1 the sheep cries
ad03 1 for aphrodite to appear to be happy would be impossible .
ad03 0 * who did plato listen to dp demosthenes ' oration about ?
ad03 0 * me gave it to him .
ad03 1 if one were to steal talismans from witches , then that would be dangerous .
ad03 1 bill did not destroy the world .
ad03 1 both the twins might have been at the party .
ad03 1 that plato loved aster was obvious .
ad03 1 the pigs grunt
ad03 0 * where place are you living .
ad03 0 * who was for medea to poison awful ?
ad03 0 * julie maintained if the barman was sober .
ad03 1 the analysis of lucy took longer than that of gomez .
ad03 0 * julie became a fond .
ad03 1 i climbed up the tree .
ad03 1 i inquired when we could leave .
ad03 1 where alison and david soaked their feet was in the kitchen
ad03 0 * what medea wondered if was the potion was ready
ad03 1 that photograph of jane of lucy 's
ad03 1 the constant reading of shakespeare satisfied me
ad03 0 * medea wondered if that the potion was ready
ad03 1 what she thought was that the poison was neutralised
ad03 1 because she had got the highest marks , medea was happy
ad03 1 when did you arrive ?
ad03 1 which poisonous plant is it certain that we will find in amazonia ?
ad03 1 the microphone salesman 's 0 irritating patter was relentless .
ad03 1 the paris i used to know is no more
ad03 1 sam gave the cloak to lee and gave the magic chalice to matthew .
ad03 1 gilgamesh has eaten the honey
ad03 1 i will eat a mango , and gillian will too .
ad03 1 computer viruses increased in virulence last year .
ad03 1 at trade , anson danced extremely frantically
ad03 1 richard is going to chop some wood .
ad03 1 the poem that homer wrote .
ad03 1 who did drink the poison ?
ad03 1 evan 's every idea was completely insane .
ad03 1 sally is making scones , and gillian is too .
ad03 1 everyone claimed that the poison was neutralized .
ad03 0 * jonathan persuaded kate to lick himself .
ad03 1 that jason arrived infuriated medea .
ad03 1 medea was happy , because she had got the highest marks
ad03 1 keep yourself clean !
ad03 1 cassandra has foretold disaster again .
ad03 1 bill 's reading of shakespeare satisfied me
ad03 1 who poisoned who ?
ad03 1 pigs love truffles .
ad03 1 owners of pigs love truffles
ad03 1 so quickly did the vampire move , that we barely saw him .
ad03 1 humans love to eat those pigs .
ad03 0 * she has kissed she .
ad03 0 * jason intended for he to learn magic .
ad03 1 jason persuaded medea to be treated by the doctor
ad03 1 it is true that i might be doing something other than going to the party .
ad03 1 jason expected medea to be treated by the doctor
ad03 0 * i found there .
ad03 1 moya said she liked football .
ad03 1 anson became the mayor
ad03 1 kane ate dirt .
ad03 1 benjamin gave the cloak to nathan
ad03 0 * the fig chuckled
ad03 1 poseidon had run away , because the executioner murdered hera .
ad03 1 a description of aristotle is in the book .
ad03 1 julie and jenny did .
ad03 1 it 's quarter past four .
ad03 1 owners of a pig love to eat truffles .
ad03 0 * that whether the world is round is unknown bothered athena .
ad03 1 no one expected agamemnon to to win
ad03 1 euclid was interested in plato 's description of geometry .
ad03 0 * every reading shakespeare satisfied me
ad03 0 * can will he do it ?
ad03 1 medea poisoned who ?
ad03 0 * he looked up it
ad03 0 * who guy did you see .
ad03 0 * we kicked myself
ad03 0 * who would poseidon run away , if the executioner murdered ?
ad03 0 * anson kissed him
ad03 1 which city the claim that philip would invade .
ad03 1 i have n't left yet
ad03 0 * i am eating a mango and gillian has too .
ad03 0 * letter is on the table
ad03 1 who ate the cake ?
ad03 1 why did you say that you were leaving ?
ad03 1 michael left meg ad03 0 * aphrodite quickly may free the animals ad03 1 the reading of shakespeare satisfied me ad03 0 * the weather rained ad03 0 * gilgamesh seek may ishtar ad03 1 no one expected to win . ad03 0 * who did that plato loved seem to be known by everyone . ad03 1 the bear sniffs ad03 1 it hung on the wall . ad03 0 * jason killed . ad03 0 * many people were there playing on the beach ad03 1 know yourself ! ad03 1 agamemnon attempted to behave well . ad03 1 julie felt he was there ad03 1 he thought that dracula was the prince of darkness . ad03 1 i have eaten already ad03 1 it is not true that i have left yet . ad03 0 * that monkeys is eating the banana . ad03 1 i could have been flying helicopters by now . ad03 0 * anson put a book ad03 0 * gilgamesh might have not been reading the cuneiform tablets . ad03 1 i asked if medea poisoned jason . ad03 1 who did you persuade to go ? ad03 1 what did you get all for xmas ? ad03 1 some disgruntled old pigs in those ditches love truffles ad03 1 jason was killed . ad03 0 * i would like to might do it ad03 0 * peter is some disgruntled old pigs in those ditches . ad03 0 * there was him in the garden . ad03 1 gilgamesh is in the dungeon . ad03 1 anson will come to the party . ad03 1 gilgamesh has never flown a dragon . ad03 0 * julie maintained her own questions over the course of the argument . ad03 1 his analysis , i never liked . ad03 1 that bottle of water might . ad03 0 * did medea poison who ? ad03 1 she took a picture of the phoenix ad03 0 * look after herself ! ad03 1 who did medea poison ? ad03 1 i tried for to get them . ad03 1 who did you introduce athena to ? ad03 1 can i keep the screwdriver just like a carpenter keep the screwdriver ? ad03 1 jason refrained from casting the spell ad03 1 andrew likes lard on his sandwiches ad03 1 who seemed to have left first ? 
ad03 0 * ron asked that the potion was ready ad03 1 hierarchy of projections : ad03 1 we decided to paint the bathroom a lurid lime green colour . ad03 1 she kicked her ad03 0 * he knows he . ad03 1 i believed there to be three books on the subject . ad03 0 * the child wail ad03 1 which girl ate the cake ? ad03 1 that plato lived in the city of athens was well-known . ad03 0 * collapsed harry . ad03 1 for you to do that would be a mistake . ad03 0 * jason thinks who medea had poisoned . ad03 1 i believe she is pregnant ad03 1 no one expected him to to win . ad03 1 he 'll no can do it , can he ? ad03 0 * which poem did you hear homer 's recital of last night ? ad03 1 raffi slept well , and gillian will too ad03 1 he 's bound to should do it ad03 1 it might have cracked open ad03 1 where did perseus see the gorgon ? ad03 1 the scissors are lost ad03 1 gilgamesh should be slowly tickling the mandrake ad03 0 * agamemnon seems pro to be a maniac ad03 0 * myself saw me ad03 1 i believed she was pregnant ad03 1 anson gave fluffy to jenny . ad03 1 the very old and extremely wise owl . ad03 0 * who did that plato loved prove to be his undoing . ad03 0 * what medea believed was jason to be a murderer . ad03 1 the owl hated the evil bat and loved the wise eagle . ad03 1 no one could remove the blood on the wall ad03 0 * he can can go ad03 0 * gillian has made pasta and david is too . ad03 0 * jason intended for pro to learn magic . ad03 1 the boys should could all go ad03 0 * i assumed to be innocent ad03 1 anson danced extremely frantically at trade . ad03 0 * the gorgon is easy to believe the claim that perseus slew . ad03 0 * she kicked itself ad03 1 julie became a fond of lloyd . ad03 1 lee 's youngest and dawn 's oldest son ran away . ad03 1 anson kicked the cat ad03 1 merlin is extremely evil . ad03 1 syntax is easy to pretend that you can teach . ad03 1 i want to eat macaroni ad03 1 which ode did which poet write ? 
ad03 0 * what she thought that was the poison was neutralised ad03 1 who drank the poison ? ad03 1 what medea arranged was for her children to be poisoned . ad03 1 no one 's mother had baked anything . ad03 1 what kind of actor is he ? ad03 1 what did she eat ? ad03 0 * frantically at , anson danced extremely trade ad03 1 i have often a cold . ad03 1 who did maria say that she 'd kiss and kick ? ad03 1 where did they go all for their holidays ? ad03 1 they came running over the hill and through the woods ad03 0 * the airport yawned ad03 1 how quickly did you eat the cake ? ad03 1 many fish are in the sea . ad03 1 they arrived first ad03 1 people were playing on the beach . ad03 0 * benjamin gave to lee it . ad03 0 * he liked anson . ad03 0 * the bear sniff ad03 0 * i inquired could we leave early . ad03 1 the bears sniff ad03 0 * i persuaded there to be a problem . ad03 1 his book ad03 1 he looked the number up ad03 1 has jenny eaten a cake ? ad03 1 which goddess helped us ? ad03 1 medea killed jason . ad03 1 ron certainly will buy a dog . ad03 0 * they shaved david and anson . ad03 0 * we believed to be the headmaster ad03 1 which king did you wonder invaded which city ? ad03 1 no one expected agamemnon to win . ad03 0 * the day snowed ad03 1 gilgamesh never flies dragons . ad03 0 * keep myself clean ! ad03 1 the dragons have all been slain . ad03 0 * did that medea killed her children upset jason ? ad03 1 the amoeba coughed and then it fainted . ad03 1 i want to sing ad03 1 he will can go ad03 0 * medea seemed that has poisoned jason . ad03 1 having read shakespeare satisfied me ad03 0 * peter is owners of pigs . ad03 0 * odysseus attempted the helmsman to hear the sirens . ad03 1 gilgamesh may seek ishtar ad03 1 the librarian likes books . ad03 1 alison and david soaked their feet after dinner ad03 1 mary is faster than john is . ad03 1 alison and david soaked their feet in the kitchen ad03 0 * you kicked you ad03 1 did you see mary ? 
ad03 1 raffi has made pasta , and david has too . ad03 1 there seemed to be three men in the garden . ad03 1 that medea murdered jason did n't surprise anyone . ad03 1 moya 's football team loved her ad03 0 * i sent she away . ad03 1 jason persuaded medea that she should desert her family ad03 0 * aphrodite stinks to be omnipotent . ad03 1 every reading of shakespeare satisfied me ad03 0 * bill reading shakespeare and maureen singing schubert satisfy me ad03 1 when the executioner arrived , poseidon was asleep ad03 1 they kicked themselves ad03 1 many vampires have become vegetarian . ad03 0 * that that the world is round is obvious upset hermes . ad03 0 * bill not destroyed the world . ad03 1 john saw stephan ad03 0 * i destroyed there . ad03 0 * what was euclid interested in plato 's description of ? ad03 1 i like anson ad03 1 the dragons simply all died out . ad03 1 gilgamesh did not fly the dragon . ad03 1 which goddess might help us ? ad03 1 humans love to eat pigs . ad03 1 which poem about achilles did homer recite ? ad03 1 the boys should all could go ad03 0 * the owl hated the evil and the wise eagle . ad03 1 the shield that saved achilles life . ad03 1 evan 's every desire ad03 1 i wondered whether medea had fled . ad03 1 i have eaten my hat already ad03 0 * he will could go ad03 1 jenny swallowed the fly ad03 1 the flying car hit the tree in the air ad03 1 i have a book . ad03 1 jason thought of defending the dragon ad03 1 it seems that agamemnon is a maniac ad03 0 * which city do you believe the claim that philip would invade ? ad03 1 we claimed that perseus had killed the gorgon ad03 1 we need some technician to help us . ad03 0 * the scissors is lost ad03 1 i have been flying helicopters for years . ad03 1 sam gave the cloak to lee and the magic chalice to matthew . ad03 0 * we kicked us ad03 1 no reading of shakespeare satisfied me ad03 1 what did he reply ? 
ad03 0 * it was claimed that by everyone the poison was neutralised ad03 1 i asked which city which king invaded . ad03 1 raffi makes pesto pasta , and david does too ad03 1 eat dirt ! ad03 1 look after yourself ! ad03 0 * she wanted to can leave ad03 1 arthur gave the tapestry to lancelot . ad03 1 we took the car to the town ad03 1 benjamin gave the cloak to lee . ad03 1 not reading shakespeare satisfied me ad03 0 * there were killed three men . ad03 1 gilgamesh has not been reading the cuneiform tablets ad03 0 * the imposition of the government of a fine . ad03 1 when alison and david soaked their feet was after dinner ad03 1 this problem 's analysis is made a lot easier when you understand differential equations . ad03 0 * dracula thought that himself was the prince of darkness . ad03 0 * gilgamesh might have been not reading the cuneiform tablets . ad03 1 who asked which statue which tourist had taken a photo of ? ad03 1 willow said that she 'd kiss tara and kick xander . ad03 1 i climbed right up the tree . ad03 1 all the dragons had escaped . ad03 1 who did you attempt to force jason to kill ? ad03 1 i thought of the moon ad03 0 * benjamin thought he would give the cloak to lee and the cloak to lee he gave . ad03 1 i wondered had he left yet . ad03 1 i thought she was pregnant ad03 1 i arranged for him to see her . ad03 1 it was over the hill and through the woods that they came running ad03 1 who did you ask saw what ? ad03 0 * gilgamesh might can seek ishtar ad03 1 gilgamesh arrived ad03 0 * jason arrived by medea . ad03 1 oil spread over the sea shore . ad03 1 what jason asked was whether the potion was ready ad03 0 * jason asked whether that the potion was ready ad03 1 have you seen mary ? i have vp seen mary ad03 1 it seems that agamemnon left . ad03 1 those monkeys are eating the banana . ad03 0 * i introduced her to he . ad03 0 * nathan showed to benjamin it . ad03 0 * he kicked yourself ad03 1 anson tried to shave himself . ad03 0 *? 
gilgamesh never has flown a dragon . ad03 0 * what julie did of lloyd was become fond . ad03 1 it is not allowed to incriminate oneself . ad03 1 the analysis of the problem was flawed ad03 1 which goddess did help us ? ad03 1 poseidon appears to have turned out to have left . ad03 0 * gilgamesh has been not reading the cuneiform tablets . ad03 0 * danced extremely , anson frantically at trade ad03 0 * aphrodite wanted to live and ishtar tried to do ad03 0 * i kicked yourself ad03 1 how fond of esther is agamemnon ? ad03 1 ron heard a discussion in the foyer ad03 0 * my mother hated myself ad03 1 the students demonstrated the technique this morning ad03 1 he walked up the hill . ad03 0 * we wanted to ate cake ad03 0 * jason knew those medea had cast the spell ad03 0 * gilgamesh must should seek ishtar ad03 1 aphrodite said he freed the animals and free the animals he did ad03 1 did you drink the poison ? ad03 1 whether agamemnon had triumphed was unknown . ad03 0 * her has kissed her . ad03 1 i often have a cold . ad03 0 * jason whispered the phoenix had escaped ad03 0 * bill reading of shakespeare satisfied me ad03 1 did n't the magic work ? ad03 1 anson thought julie had fainted ad03 1 the horse fell ad03 0 * odysseus attempted odysseus to hear the sirens . ad03 1 burn letters to peter ! ad03 1 genie intoned the prayer ad03 1 gilgamesh did n't fly the broomstick . ad03 1 ron 's likely to be on the web , is n't he ? ad03 1 bill 's reading shakespeare and maureen 's singing schubert satisfy me ad03 0 * owners of a pig loves to eat truffles ad03 0 * gilgamesh might loved ishtar ad03 1 paul had an affair ad03 1 poseidon appears to own a dragon ad03 1 the twins might have both been at the party . ad03 0 * that jason had arrived was obvious infuriated medea . 
ad03 1 that i should evaporate is my fondest dream ad03 1 what gilgamesh may do is seek ishtar ad03 1 you said that anson thought that julie had fainted ad03 1 the owl hated the evil bat and the wise eagle ad03 1 what did john buy ? ad03 1 agamemnon forced aphrodite to leave the school . ad03 1 there is a description of aristotle in the book . ad03 0 * medea exclaimed if the potion was ready ad03 1 humans love to eat them . ad03 0 * someone did medea poison . ad03 1 perhaps iphigenia will have murdered oedipus by tomorrow . ad03 1 so that he could escape , jason became invisible ad03 0 * i wondered who had medea poisoned . ad03 1 i asked did medea poison jason . ad03 1 agamemnon stopped jason from casting the spell ad03 1 no one wanted any cake . ad03 1 i wanted jimmy for to come with me . ad03 0 * he walked the hill up . ad03 1 they should have all sent oedipus to thebes ad03 0 * those monkey are eating the banana . ad03 0 * who had poseidon run away , before the executioner murdered ? ad03 1 i asked anson if he was happy ad03 1 daniel became a blond . ad03 0 * has that we have arrived back at our starting point proved that the world is round ? ad03 1 it was the man i saw that you wanted to meet . ad03 1 that photograph by gomez of pugsley of lucy 's ad03 1 i ate that . ad03 1 it snowed ad03 1 aphrodite said he would free the animals and free the animals he will ad03 1 that the golden thread would show jason his path through the labyrinth was ad03 1 julie and jenny arrived first ad03 1 what have you eaten ? ad03 0 * peter is owners . ad03 0 * i said this he left ad03 1 who has drunk my whiskey ? ad03 0 * you said she liked yourself ad03 0 * she tried to left ad03 1 i 'd planned to have finished , and finished i have ad03 1 ron expected the sack . ad03 1 that i am here proves that i care . ad03 0 * she tried to may leave ad03 1 gilgamesh misses aphrodite ad03 0 * who seemed had poisoned jason ? ad03 1 that plato loved aster seemed to be known by everyone . 
ad03 1 when dining with evil crocodiles , it is advisable to wear armour . ad03 0 * benjamin said he would give the cloak to lee and give the cloak he did to lee . ad03 1 did the magic work ? ad03 1 who has drunk the poison ? ad03 1 benjamin said he would give the cloak to lee and give the cloak to lee he did . ad03 1 jason became invisible , so that he could escape ad03 1 aphrodite may quickly free the animals . ad03 1 the horse galloped ad03 1 how quickly did the greeks take troy ? ad03 1 some happy pigs which can fly love truffles ad03 1 julie felt a twinge in her arm ad03 1 the wizard turned the beetle into beer with a wave of his wand ad03 0 * who seemed that had poisoned jason ? ad03 1 kick me ! ad03 1 we wanted to eat cake ad03 1 gomez 's photograph of pugsley belonging to lucy . ad03 1 all the boys should could go ad03 1 julie maintained her own ideas over the course of the argument . ad03 1 the intrepid pirate and the fearful captain 's mate sunk the galleon . ad03 1 gilgamesh might not have been reading the cuneiform tablets . ad03 1 it was obvious that plato loved aster obvious . ad03 1 he loves him ad03 1 we all thought he was unhappy ad03 0 * emily showed himself benjamin in the mirror . ad03 1 anson believed the report . ad03 1 i looked the number up . ad03 0 * anson is incredibly difficult to be pleased . ad03 1 no vampire slept . ad03 1 after the executioner left , poseidon wept . ad03 1 peter was at the party ad03 0 * whales have i seen . ad03 0 * i thought she is pregnant ad03 0 * himself saw him ad03 1 that he is coming is clear . ad03 0 * there seem three men to be in the garden . ad03 0 * he analysis her was flawed ad03 1 where all did they go for their holidays ? ad03 1 gilgamesh decided not to kill ishtar ad03 1 bill reading shakespeare satisfied me ad03 1 perseus saw the gorgon in his shield . ad03 1 poseidon would run away , if the executioner murdered hera . ad03 0 * who did a statue of surprise medea ? 
ad03 1 what did you say ( that ) the poet had written ? ad03 1 i saw people playing there on the beach . ad03 0 * who was that plato loved obvious ? ad03 1 i did n't want any cake . ad03 1 that i should kiss pigs is my fondest dream ad03 0 * gilgamesh flew not the broomstick . ad03 1 ron failed biology , unfortunately ad03 1 the men chuckle ad03 1 i expected there to be a problem . ad03 1 gilgamesh wanted to seduce ishtar , and seduce ishtar he did . ad03 1 harry collapsed . ad03 1 i asked who saw what . ad03 0 * the doctor arrived a new actor . ad03 0 * him loves him ad03 0 * who had poseidon run away , because the executioner murdered ? ad03 1 he has been happy ad03 1 poseidon had run away , before the executioner murdered hera . ad03 0 * which the poem did homer recite ? ad03 0 * not reading of shakespeare satisfied me ad03 0 * who did athena introduce who to ? ad03 1 merlin is a dangerous sorcerer . ad03 1 anson saw anson . ad03 1 i am to eat macaroni . ad03 1 poseidon had escaped , before the executioner arrived . ad03 1 owners love truffles ad03 0 * the dragons were slain all . ad03 0 * i saw him ever . ad03 1 humans love to eat owners of pigs . ad03 0 * i have sent 0 letter to environmental heath ad03 0 * what jason asked whether was the potion was ready ad03 1 those pigs love truffles ad03 0 * we all thought he to be unhappy ad03 1 i 'd planned to have finished by now . ad03 1 has the potion not worked ? ad03 1 what i love is toast and sun dried tomatoes ad03 1 mary ran . ad03 0 * the man i saw shaved myself . ad03 0 * readings shakespeare satisfied me ad03 0 * the picture of no one hung upon any wall . ad03 1 he replied that he was happy . ad03 1 no one could remove the blood from the wall ad03 1 julie maintained that the barman was sober . ad03 0 * i kicked me ad03 1 benjamin gave lee the cloak . ad03 1 aphrodite wanted hera to persuade athena to leave . ad03 1 gilgamesh is fighting the dragon . 
ad03 1 i claimed she was pregnant ad03 1 for jenny , i intended to be present . ad03 1 gilgamesh missed aphrodite ad03 1 she might be pregnant . ad03 0 * the pig grunt ad03 1 anson demonized david at the club . ad03 1 jason asked whether the potion was ready ad03 1 frieda closed the door ad03 0 * peter is the old pigs . ad03 1 medea might have given jason a poisoned robe ( just treat a poisoned robe as an np ad03 1 quickly kiss anson ! ad03 0 * anson believed jenny to have hurt himself . ad03 1 julie felt hot ad03 1 agamemnon expected esther to seem to be happy . ad03 0 * him book ad03 1 that the answer is obvious upset hermes . ad03 0 * the consul 's gift of himself to the gladiator . ad03 1 homer recited the poem about achilles ? ad03 1 no vampire can survive sunrise . ad03 1 under the bed is the best place to hide ad03 1 anson appeared ad03 1 there seems to be a problem . ad03 0 * anson became that he was happy ad03 1 i intoned that she was happy ad03 0 * we all thought him was unhappy ad03 1 medea saw who ? ad03 1 no one expected that agamemnon would win . ad03 1 believing that the world is flat gives one some solace . ad03 1 kick them ! ad03 0 * the bears sniffs ad03 0 * where did you disappear before you hid the gold ? ad03 0 * she tried to do go . ad03 1 medea wondered if the potion was ready ad03 1 who all did you meet when you were in derry ? ad03 1 who did you hear an oration about ? ad03 1 alison ran ad03 1 romeo sent letters to juliet . ad03 1 richard 's gift of the helicopter to the hospital and of the bus to the school . ad03 1 nathan caused benjamin to see himself in the mirror . ad03 1 a. madeleine planned to catch the sardines and she did . ad03 0 * medea tried medea to poison her children . ad03 0 * which temple did athena contemplate the reason that her devotees had built ? ad03 1 i did not understand . 
ad03 1 gilgamesh loved ishtar and aphrodite did too ad03 1 we believed him to be omnipotent ad03 0 * ron captured quickly a phoenix ad03 1 david ate mangoes and raffi should too . ad03 1 julie and fraser ate those delicious pies in julie 's back garden . ad03 1 the old pigs love truffles ad03 1 the boys all should could go ad03 1 aphrodite quickly freed the animals ad03 1 paul had two affairs ad03 1 what alison and david did was soak their feet in a bucket ad03 1 anson demonized david almost constantly . ad03 0 * agamemnon seemed that left . ad03 1 anson 's hen nibbled his ear . ad03 0 * what a kind of actor is he ? ad03 0 * the constantly reading shakespeare satisfied me ad03 1 before the executioner arrived , poseidon had escaped ad03 1 gilgamesh did n't leave . ad03 1 genie intoned that she was tired ad03 1 look at all these books . which book would you like ? ad03 0 * there were killed three men by the assassin . ad03 0 * peter is those pigs . ad03 1 i do n't remember what i said all ? ad03 1 the pig grunts ad03 0 * the poison was neutralised was claimed that by everyone ad03 1 people are stupid ad03 1 what i arranged was for jenny to be present . ad03 1 i compared ginger to fred ad03 0 * peter is pigs ad03 1 which poet wrote which ode ? ad03 1 how did julie ask if jenny left ? ad03 1 dracula thought him to be the prince of darkness . ad03 0 * he ca n't possibly do that , possibly he ? ad03 1 i must eat macaroni . ad03 1 i asked who john would introduce to who . ad03 0 * the owl hated the and loved the bat . ad03 1 reading shakespeare satisfied me ad03 1 humans love to eat owners . ad03 1 gilgamesh fears death and achilles does as well ad03 0 * the pigs grunts ad03 0 * constant reading shakespeare satisfied me ad03 0 * anson believed to be happy . ad03 1 how did julie say that jenny left ? ad03 1 show me letters ! ad03 1 the readings of shakespeare satisfied me ad03 1 anson demonized david every day . 
ad03 1 the students demonstrated this morning ad03 1 we believed aphrodite to be omnipotent . ad03 1 emily caused benjamin to see himself in the mirror . ad03 0 * anson left before jenny saw himself . ad03 1 nothing like that would i ever eat again . ad03 1 where has he put the cake ? ad03 1 jason persuaded medea to desert her family ad03 1 gilgamesh perhaps should be leaving . ad03 1 gilgamesh has n't kissed ishtar . ad03 0 * anson thought that himself was going to the club . ad03 0 * poseidon appears to own a dragon ad03 0 * digitize is my happiest memory ad03 1 it is easy to slay the gorgon . ad03 1 i had the strangest feeling that i knew you . ad03 1 what all did you get for christmas ? ================================================ FILE: Chapter02/out_of_domain_dev.tsv ================================================ clc95 1 somebody just left - guess who . clc95 1 they claimed they had settled on something , but it was n't clear what they had settled on . clc95 1 if sam was going , sally would know where . clc95 1 they 're going to serve the guests something , but it 's unclear what . clc95 1 she 's reading . i ca n't imagine what . clc95 1 john said joan saw someone from her graduating class . clc95 0 * john ate dinner but i do n't know who . clc95 0 * she mailed john a letter , but i do n't know to whom . clc95 1 i served leek soup to my guests . clc95 1 i served my guests . clc95 0 * she was bathing , but i could n't make out who . clc95 0 * she knew french for tom . clc95 0 * john is tall on several occasions . clc95 0 * the ship sank , but i do n't know with what . clc95 0 * they noticed the painting , but i do n't know for how long . clc95 0 * john was tall , but i do n't know on what occasions . clc95 1 joan ate dinner with someone but i do n't know who . clc95 1 joan ate dinner with someone but i do n't know who with . clc95 0 * i know that meg 's attracted to harry , but they do n't know who . 
clc95 0 * since jill said joe had invited sue , we did n't have to ask who . clc95 1 i know that meg 's attracted to harry , but they do n't know who . clc95 0 * she said she had spoken to everybody , but he was n't sure who . clc95 0 * each of the performers came in , but were sitting so far back that we could n't see who . clc95 1 she did n't talk to one student . clc95 0 * she does n't meet anyone for dinner , but they ca n't figure out who . clc95 1 everyone relies on someone . it 's unclear who . clc95 1 each student wrote a paper on a mayan language , but i do n't remember which one . clc95 1 the newspaper has reported that they are about to appoint someone , but i ca n't remember who the newspaper has reported that they are about to appoint . clc95 1 the newspaper has reported that they are about to appoint someone , but i ca n't remember who they are about to appoint . clc95 1 most columnists claim that a senior white house official has been briefing them , and the newspaper today reveals which one . clc95 1 most columnists claim that a senior white house official has been briefing them , but none will reveal which one . clc95 1 bill wondered how many papers sandy had read , but he did n't care which ones . clc95 1 i never know which papers sandy has read , but i usually know how many . clc95 1 sandy had read how many papers ? ! clc95 1 everybody gets on well with a certain relative , but often only his therapist knows which one . clc95 1 which book did each author recommend ? clc95 1 his or her least known work . clc95 1 they were going to meet sometime on sunday , but the faculty did n't know when . clc95 1 john likes some students , but i do n't know who . clc95 1 i do n't know who john likes . clc95 0 * john likes some students , but i do n't know who john likes some students . clc95 0 * joan said she talked to the students , but fred could n't figure out who . clc95 0 * he announced he had eaten the asparagus , but we did n't know what . 
clc95 1 she was reading the books under the table , but fred did n't know what books . clc95 1 he announced he would marry the woman he loved most , but none of his relatives could figure out who . clc95 1 she talked to john or mary but i do n't know which . clc95 1 she talked to john or mary but i do n't know which one . clc95 1 she talked to harry , but i do n't know who else . clc95 1 i will see them , but i do n't know how many of them . clc95 1 everyone who knows either susan or laura likes her . clc95 0 * she said she talked to three students but i do n't know how many . clc95 0 * she said she talked to those students but i do n't know how many . clc95 1 he shouted again , but i do n't know who to . clc95 1 she was dancing with somebody , but i do n't know who with . clc95 1 several firefighters were injured , but it 's not known . clc95 1 meg is attracted to harry , but they do n't know who she is attracted to . clc95 1 sandy was trying to work out which students would be able to solve a certain problem , but she would n't tell us which one . clc95 0 * sandy was trying to work out which students would be able to solve a certain problem , but she would n't tell us which one . clc95 0 * john and someone were dancing together , but i do n't know who . clc95 1 the ta 's have been arguing about whether some student or other should pass , but i ca n't now remember which one . clc95 0 * it has been determined that somebody will be appointed ; it 's just not clear yet who . clc95 0 * sally asked if somebody was going to fail math class , but i ca n't remember who . clc95 0 * the ta 's have been arguing about whether some student or other should pass , but i ca n't now remember which one . clc95 1 sandy is very anxious to see if the students will be able to solve the homework problem in a particular way , but she wo n't tell us which . 
clc95 1 sandy is very anxious to see if the students will be able to solve the homework problem in a particular way , but she wo n't tell us in which way . clc95 1 clinton is anxious to find out which budget dilemmas panetta would be willing to tackle in a certain way , but he wo n't say in which . clc95 1 sandy is wondering whether there will be students who have to drop the class for a certain reason , but she wo n't reveal what . clc95 0 * in which way is sandy very anxious to see if the students will be able to solve the homework problem ? clc95 0 * in which way is clinton anxious to find out which budget dilemmas panetta would be willing to solve ? clc95 1 i know how many assignments i 've graded , but i do n't know how many bill has . clc95 0 * what did you leave before they did ? clc95 0 * what did you leave before they started playing ? clc95 1 sandy was trying to work out which students would be able to solve a certain problem . clc95 1 the administration has issued a statement that it is willing to meet with one of the student groups . clc95 1 sandy was trying to work out which students would be able to solve a problem . clc95 1 the administration has issued a statement that it is willing to meet a student group . clc95 1 the administration has issued a statement that it is willing to meet a student group , but i 'm not sure which one . clc95 1 i think agnes said that bill would speak , but i do n't remember what about . clc95 0 * agnes wondered how john could eat but it 's not clear what . clc95 0 * tony sent mo a picture that he painted , but it 's not clear with what . clc95 1 she 's been dancing but we do n't know with whom . clc95 0 * who did they see someone ? c-05 1 it was believed by everybody that mary was a thief . c-05 1 that professor is feared by all students . c-05 1 mary was respected by john . c-05 1 ted was bitten by the spider . c-05 0 * the book was by john written . c-05 0 * the argument was summed by the coach up . 
c-05 1 the paper was written up by john . c-05 0 * the paper was written by john up . c-05 1 john was spoken to by mary . c-05 0 * john was spoken by mary to . c-05 1 the book was seen by mary . c-05 0 * john was seen the book . c-05 1 the book was written . c-05 0 * john was spoke by mary to . c-05 1 the table was wiped clean by john . c-05 0 * the table was wiped by john clean . c-05 0 * mary was given by john the book . c-05 1 john was believed to be telling the truth by mary . c-05 1 john was believed by mary to be telling the truth . c-05 1 the car was driven by john to maine . c-05 1 it was believed by the students that they would have an exam . c-05 0 * the magazines were sent to herself by mary . c-05 0 * chocolate eggs were hidden from each other by the children . c-05 1 the magazines were sent by mary to herself . c-05 1 chocolate eggs were hidden from no child by any adult . c-05 1 tabs were kept on each agent by the other . c-05 1 chocolate eggs were hidden from every child by his mother . c-05 1 books were taken from no student and given to mary . c-05 0 * books were taken from no student and given to mary by any professor . c-05 1 books were taken from each student by the other . c-05 1 books were taken from each student and given to mary . c-05 0 * books were taken from each student and given to mary by the other . j_71 1 jack hates sue and is loved by mary . j_71 1 vera sent a baby alligator to max and a leather dinosaur to phyllis . j_71 1 either sam plays the bassoon or jekyll the oboe . j_71 1 sam does n't play bassoon , nor medusa oboe . j_71 0 * bill ate the peaches , but harry the grapes . j_71 1 i no more could have stolen that steak than jack the diamonds . j_71 1 bill ate more peaches than harry did grapes . j_71 0 * bill ate the peaches and harry did the grapes . j_71 0 * tom will smoke the grass , and reuben has the hash . j_71 1 if the ants were called elephants and elephants ants , i 'd be able to squash an elephant . 
j_71 1 simon quickly dropped the gold , and jack the diamonds . j_71 1 bob tried to wash himself , and mary to read the funnies . j_71 1 harry told sue that albania is a lovely place for a vacation , and tom told sally that albania is a lovely place for a vacation . j_71 1 harry told sue that albania is a lovely place for a vacation , and tom . j_71 1 max seemed to be trying to begin to love harriet , and fred to be trying to begin to love sue . j_71 1 max seemed to be trying to force ted to leave the room , and walt , ira . j_71 0 * max seemed to be trying to force ted to leave the room , and walt to stay a little longer . j_71 0 * arizona elected goldwater senator , and massachusetts , mccormack . j_71 0 * millie will send the president an obscene telegram , and paul , the secretary a rude letter . j_71 0 * maytag will give a brand-new dryer to the winner of the mrs . j_71 0 * bill did n't eat the peaches , nor harry . j_71 1 bill ate the peaches , and harry did , too . j_71 0 * bill must quickly eat the peaches , and harry must slowly . j_71 1 whenever russia has made a major political blunder , the u.s. has too . j_71 1 bill 's story about sue and max 's about kathy both amazed me . j_71 1 i bought three quarts of wine and two of clorox . j_71 1 scientists at the south hanoi institute of technology have succeeded in raising one dog with five legs , another with a cow 's liver , and a third with no head . j_71 1 bill 's story about sue may be amazing , but max 's is virtually incredible . j_71 1 i like bill 's yellow shirt , but not max 's . j_71 1 bill 's funny story about sue and max 's boring one about kathy both amazed me . j_71 1 bill 's wine from france and ted 's from california can not be compared . j_71 0 * as a teacher , you have to deal simultaneously with the administration 's pressure on you to succeed , and the children 's to be a nice guy . 
j_71 1 neither von karajan 's recording of beethoven 's 6th on columbia nor klemperer 's on angel has the right tempo . j_71 0 * gould 's performance of bach on the piano does n't please me anywhere as much as ross 's on the harpsichord . j_71 0 * tom 's dog with one eye attacked frank 's with three legs . j_71 0 * because steve 's of a spider 's eye had been stolen , i borrowed fred 's diagram of a snake 's fang . j_71 1 neither von karajan 's recording of beethoven 's 6th on columbia nor klemperer 's has the right tempo . j_71 1 tom 's dog with one eye attacked fred 's . j_71 1 i borrowed fred 's diagram of a snake 's eye because steve 's had been stolen . j_71 1 jerry attempted to blow up the pentagon . j_71 1 so fast did he run that nobody could catch him . j_71 1 bill bought a red house , and max bought one too . s_97 1 who always drinks milk ? s_97 1 the book which inspired them was very long . s_97 0 * the book what inspired them was very long . s_97 1 i know the person whose mother died . s_97 1 the person whose mother 's dog we were all fond of . s_97 1 i wonder whose mother died . s_97 1 i wonder whose mother 's dog died . s_97 1 i wonder to whom they dedicated the building . s_97 1 give me the phone number of the person whose mother 's dog died . s_97 1 this is the senator to whose mother 's friend 's sister 's i sent the letter . s_97 0 * i want goes to the store . s_97 0 * i wonder what to be a clown on the cover of . s_97 0 * bother you that kim left ! s_97 0 * a student who to talk to us just walked in . s_97 1 whose bagels do you like ? s_97 1 i wonder in whom to place my trust . s_97 1 there were several old rocks songs that she and i were the only two who knew . s_97 0 * it was to to amuse us that kim was singing that they wanted . s_97 0 * what they feared most was to be no one available to help them . s_97 0 * we tried to amuse them that kim was singing . s_97 1 mary asked me if , in st. louis , john could rent a house cheap . 
s_97 0 * mary arranged for , in st. louis , john to rent a house cheap . s_97 1 it would be unwise for there to be no fire exit . s_97 1 i believe there to be no way out . s_97 0 * i wonder in whom them to place their trust . s_97 0 * i wonder whom us to trust . s_97 0 * i wonder who for us to trust . s_97 1 i wonder who to place my trust in . s_97 1 i know the people that voted in the election . s_97 1 i threw away a book that sandy thought we had read . s_97 1 i thought that you were sick . s_97 0 * i dislike the people in who we placed our trust . s_97 1 i dislike the company in which we placed our trust . s_97 1 i dislike the people in whose house we stayed . s_97 1 i dislike the person with whom we were talking . s_97 0 * jones , that we were talking to last night , always watches football games alone . s_97 1 a letter was received that jones would be upset by . s_97 0 * a letter was received jones would be upset by . s_97 1 i saw someone yesterday i had n't seen for years . s_97 1 something happened i could n't really talk about . s_97 1 the only person that i like whose kids dana is willing to put up with is pat . s_97 1 the book that i like which everyone else in the class hates was written by john . s_97 0 * the only person whose kids dana is willing to put up with was written by john . s_97 0 * the book that i like - everyone else in the class hates . s_97 1 the only person whose kids dana is willing to put up with is pat . s_97 1 which book 's , author did you meet ? s_97 0 * which boy 's , mother , did you meet who you liked ? s_97 0 * which book 's , author did you meet who you liked ? s_97 1 which boy 's , mother , did you meet ? s_97 1 all who lost money in the scam are eligible for the program . s_97 0 * who for sandy to talk to is still enrolled in the class ? s_97 1 who who you like does sandy also like ? s_97 1 everything you like is on the table . s_97 1 the bills passed by the house yesterday that we objected to were vetoed . 
s_97 1 the only people being added to our group who were at harvard were students . swb04 1 we like ourselves . swb04 1 nobody likes us . swb04 0 * leslie likes ourselves . swb04 0 * ourselves like ourselves . swb04 1 she voted for herself . swb04 0 * we gave us presents . swb04 1 we gave ourselves presents . swb04 1 we gave presents to ourselves . swb04 0 * we gave us to the cause . swb04 1 we gave ourselves to the cause . swb04 0 * leslie told us about us . swb04 0 * leslie told ourselves about us . swb04 1 we think that leslie likes us . swb04 0 * we think that leslie likes ourselves . swb04 1 our friends like us . swb04 1 those pictures of us offended us . swb04 0 * we found your letter to ourselves in the trash . swb04 0 * vote for you ! swb04 1 vote for yourself ! swb04 0 * we appeared to them to vote for themselves . swb04 1 we admired the pictures of us in the album . swb04 1 we admired the pictures of ourselves in the album . swb04 1 leslie used a pen . swb04 1 we put the pigs in a pen . swb04 1 we need to pen the pigs to keep them from getting into the corn . swb04 1 they should pen the letter quickly . swb04 1 the car wo n't run . swb04 1 this dye will run . swb04 1 she can run an accelerator . swb04 1 these stockings will run . swb04 1 we need another run to win . swb04 1 lee saw the student with a telescope . swb04 1 i forgot how good beer tastes . swb04 1 visiting relatives can be boring . swb04 1 if only superman would stop flying planes ! swb04 1 that 's a new car dealership . swb04 1 i know you like the back of my hand . swb04 1 max is on the phone now . swb04 1 i saw her duck . swb04 1 i 'm creating a committee . kim – you 're in charge . swb04 1 lights go out at ten . there will be no talking afterwards . swb04 1 they found the book on the atom . swb04 1 which experts testified against defendants who exposed them ? swb04 1 list all experts for the defendant who represented himself . swb04 1 list associates of each defendant who speaks spanish . 
swb04 0 * they lost themselves ' books . swb04 1 some sentences go on and on and on . swb04 0 * sentences some go on and on and on and on . swb04 1 that surprised me . swb04 0 * i noticed the . swb04 1 they were interested in his . swb04 1 this is my favorite . swb04 1 a large dog chased a small cat . swb04 1 some people yell at noisy dogs in my neighborhood . swb04 1 some people yell at the dogs in my neighborhood . swb04 1 some people yell at the dogs . swb04 1 some people yell at noisy dogs . swb04 1 some people yell at dogs . swb04 1 some people consider the noisy dogs dangerous . swb04 1 some people consider the dogs in my neighborhood dangerous . swb04 1 some people consider noisy dogs in my neighborhood dangerous . swb04 1 some people consider the dogs dangerous . swb04 1 some people consider noisy dogs dangerous . swb04 1 some people consider dogs in my neighborhood dangerous . swb04 1 some people consider dogs dangerous . swb04 1 people with children who use drugs should be locked up . swb04 1 this disease gave leslie a fever in rome . swb04 1 the love of my life and mother of my children would never do such a thing . swb04 1 most elections are quickly forgotten , but the election of 2000 , everyone will remember for a long time . swb04 0 * it is painting by klee or drawing by miro that the museum displays no . swb04 1 the defendant denied the accusation . swb04 0 * the teacher disappeared the problem . swb04 0 * the teacher handed the student . swb04 1 the bird sings . swb04 0 * the bird sing . swb04 0 * birds sings . swb04 1 the birds give the worm a tug . swb04 0 * the bird give the worm a tug . swb04 0 * the birds gives the worm a tug . swb04 1 terry delighted in my pain . swb04 0 * terry delighted . swb04 0 * terry delighted my pain . swb04 1 kerry remarked it was late . swb04 1 what additional categories and rules would be required to handle these verbs ? swb04 1 we created a monster . swb04 0 * i was already aware of fact . 
swb04 0 * the defendant deny the allegation . swb04 0 * the defendants denies the allegation . swb04 1 the defendant walks . swb04 0 * the defendant walk . swb04 0 * the defendants walks . swb04 1 how many feature structures categories can label the first daughter ? swb04 1 the child put the toy on the table . swb04 1 the teacher became angry with the students . swb04 0 * the teacher became . swb04 1 the jury believed the defendant lied . swb04 1 the guests dined . swb04 1 we relied on leslie . swb04 0 * we relied above leslie . swb04 1 we celebrated in the streets . swb04 1 we celebrated in the streets in the rain on tuesday in the morning . swb04 0 * the children are happy of ice cream . swb04 0 * the children are fond with the ice cream . swb04 0 * the children are fond that they have ice cream . swb04 1 a magazine appeared on the newsstands . swb04 1 a magazine about crime appeared on the newsstands . swb04 1 newsweek appeared on the newsstands . swb04 0 * newsweek about crime appeared on the newsstands . swb04 1 the report that crime was declining surprised many people . swb04 1 the book surprised many people . swb04 0 * the book that crime was declining surprised many people . swb04 1 the storm arrived after the picnic . swb04 0 * the storm arrived while the picnic . swb04 1 the storm arrived while we ate lunch . swb04 0 * this dogs barked . swb04 1 these dogs barked . swb04 1 a chair was broken . swb04 1 they want them arrested . swb04 1 they preferred them arrested . swb04 1 we preferred them on our team . swb04 1 with my parents as supportive as they are , i 'll be in fine shape . swb04 0 * we walks . swb04 0 * few dog barked . swb04 1 the dogs barked . swb04 1 i walk and dana runs . swb04 1 they like us . swb04 0 * us like them . swb04 1 kim likes dogs . swb04 1 dogs like kim . swb04 1 the person responsible confessed . swb04 0 * the person confessed responsible . swb04 0 * the cat slept soundly and furry . swb04 0 * the soundly and furry cat slept . 
swb04 1 chris walks , pat eats broccoli , and sandy plays squash . swb04 1 there was some particular dog who saved every family . swb04 1 susan frightens her . swb04 1 susan told her a story . swb04 1 susan told a story to her . swb04 1 susan devoted herself to linguistics . swb04 1 nobody told susan about herself . swb04 1 that picture of susan offended her . swb04 1 he offended sandy . swb04 0 * i enjoy yourself . swb04 1 they talk to themselves . swb04 1 nobody told susan . swb04 1 protect yourself ! swb04 0 * protect you ! swb04 1 i met the person who left . swb04 1 leslie slept . swb04 0 * chris handed bo . swb04 1 dana walked and leslie ran . swb04 0 * dana walking and leslie ran . swb04 0 * dana walking and leslie running . swb04 0 * the putter of books left . swb04 1 kris donated a book to the library . swb04 1 the police sprayed the protesters with water . swb04 1 the police sprayed water on the protesters . swb04 1 the students drove cars . swb04 1 these cars drive easily . swb04 1 the horse kicked me black and blue . swb04 1 they yelled . swb04 1 the horse raced past the barn fell . swb04 1 the horse that was raced past the barn fell . swb04 1 the boat seen down the river sank . swb04 1 the evidence assembled by the prosecution convinced the jury . swb04 1 lou forgot the umbrella . swb04 1 lou forgot the umbrella in the closet . swb04 1 lou hoped the umbrella was broken . swb04 0 * lou hoped the umbrella in the closet . swb04 0 * lou put the umbrella was broken . swb04 1 lou put the umbrella in the closet . swb04 1 the artist drew the child with a pencil . swb04 1 the dog bit the cat . swb04 0 * the cat was bitten the mouse . swb04 0 * the cat was bitten the mouse by the dog . swb04 0 * chris was handed sandy a note by pat . swb04 1 chris was handed a note . swb04 0 * chris was handed sandy a note . swb04 1 tv puts dumb ideas in children 's heads . swb04 1 dumb ideas are put in children 's heads by tv . swb04 1 dumb ideas are put in children 's heads . 
swb04 0 * dumb ideas are put notions in children 's heads by tv . swb04 1 the patient died . swb04 0 * the patient was died . swb04 0 * chris was handed . swb04 0 * tv puts dumb ideas . swb04 1 he was arrested by the police . swb04 1 the cat got bitten . swb04 0 * the cat were bitten by the dog . swb04 1 there is a monster in loch ness . swb04 1 it is obvious that pat is lying . swb04 1 pat is the captain of the team . swb04 0 * pat is hate chris . swb04 1 there is a unicorn in the garden . swb04 1 there was a felon elected to the city council . swb04 1 there is a seat available . swb04 0 * a seat available was in the last row . swb04 1 many people were fond of pat . swb04 1 people are looking through the window . swb04 1 a felon was elected to the city council . swb04 0 * there loved sandy . swb04 0 * we talked to them about there . swb04 1 it mattered that the giants had lost . swb04 1 that dogs bark annoys people . swb04 1 it annoys people that dogs bark . swb04 1 that chris knew the answer occurred to pat . swb04 1 it never occurred to pat that chris knew the answer . swb04 1 that the cardinal won the game gave sandy a thrill . swb04 1 it gave sandy a thrill that the cardinal won the game . swb04 0 * that sandy had lied suggested . swb04 0 * it loved sandy . swb04 1 cohen proved the independence of the continuum hypothesis . swb04 1 cohen proved that the continuum hypothesis was independent . swb04 1 we forgot our invitations . swb04 1 nobody saw pat . swb04 1 that fido barks annoys me . swb04 1 fido barks . swb04 1 chris dreads the bucket . swb04 1 the candidates bring advantage to the voters . swb04 1 tabs are kept on suspected drug dealers by the fbi . swb04 1 advantage is taken of every opportunity for improvement . swb04 1 the bucket was kicked by pat . w_80 1 john is sad . w_80 1 john loaded the wagon full with hay . w_80 0 * john loaded the wagon with hay green . w_80 0 * i presented john with it dead . w_80 1 of whom are you thinking ? 
w_80 1 john became rich . w_80 1 i gave john gold apples . w_80 1 how silly is bill considered ? w_80 1 how mad was bill made ? w_80 1 john is sick . w_80 1 john left singing . w_80 1 john is near larry . w_80 1 john gave bill the dog dead . w_80 0 * bill was struck by john as stupid . w_80 0 * john was struck as sick . w_80 1 john was struck by bill 's idiocy . w_80 1 john promised bill to leave . w_80 1 john tried to leave . w_80 1 to leave would be a pleasure . w_80 0 * john was struck by bill as pompous . w_80 0 * john was promised by bill to leave . w_80 1 they make good cooks . w_80 1 there is nothing to do . w_80 1 john has something for bill to do . w_80 1 i am counting on bill to incriminate himself . w_80 1 on whom are you counting to incriminate himself ? w_80 1 i am counting on bill to get there on time . w_80 1 i would prefer to leave . w_80 1 i would hate for john to leave . w_80 1 i would prefer for john to leave . w_80 0 * it was hated for john to leave . w_80 0 * john decided for bill to get the prize . w_80 0 * john decided bill to get the prize . w_80 1 to die is no fun . w_80 1 john wants to leave . w_80 1 john counted on bill to get there on time . w_80 1 i bought bill a book to read . w_80 1 john told mary that it would be important to leave early . w_80 1 john told mary that it was important to fred to leave early . w_80 1 john , told mary that it would be appropriate to leave together . w_80 0 * the election of john president surprised me . w_80 1 john 's arriving dead surprised me . w_80 1 the attempt by john to leave surprised me . w_80 1 john left orders to follow pete . w_80 1 john left us orders to follow pete . w_80 1 john left orders not to be disturbed . w_80 1 that he is here is clear . w_80 1 it is a problem that he is here . w_80 1 it bothers me that he is here . w_80 1 john regretted it that bill had a good time . w_80 0 * john believes it that bill is here . w_80 0 * john believes it sincerely that bill is here . 
w_80 0 * john is aware of it that bill is here . w_80 0 * john felt it that bill was tardy . w_80 0 * john believed it that bill was tardy . w_80 1 it was believed that bill was tardy . w_80 0 * that john is reluctant seems . w_80 0 * it is the problem that he is here . w_80 1 that he is here is the problem . w_80 1 the problem we are discussing is george . w_80 0 * it is to give up to leave . w_80 1 it would prove our theory to be untenable for carrots to be vegetables . w_80 0 * it was believed to be illegal by them to do that . w_80 1 john grudgingly accepted judgments of his incompetence as an auto mechanic . w_80 1 it was to john that i gave the book . w_80 1 i bought it to read . w_80 1 i bought it to give to pete . w_80 1 i gave it to pete to take to the fair . w_80 0 * i gave pete the book to impress . w_80 1 i wrote to bill . w_80 1 i presented it to bill to read . w_80 0 * i presented bill with it to read . w_80 1 i gave a book to bill to read . w_80 1 john thinks it would upset himself to die . w_80 1 john made bill mad at himself . w_80 1 john made bill master of himself . w_80 1 the correspondence school made bill a good typist . w_80 1 the correspondence school sent bill a good typist . w_80 1 john considers bill silly . w_80 1 john considers bill to be silly . w_80 0 * john bought a dog for himself to play with . w_80 1 john arranged for himself to get the prize . w_80 1 john talked to bill about himself . 
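The rows above follow the CoLA acceptability format used for BERT fine-tuning in Chapter 2: a source code (e.g. `clc95`, `j_71`, `swb04`), a binary acceptability label, an optional `*` reproducing the original linguist's star on unacceptable sentences, and the tokenized sentence. Below is a minimal parsing sketch for this layout; it assumes the whitespace-collapsed form shown here (the repository's `.tsv` file itself is tab-separated), and the helper name `parse_line` is illustrative, not part of the repository.

```python
def parse_line(line: str) -> dict:
    """Parse one acceptability row: <source> <label> [*] <sentence>."""
    source, label, rest = line.split(maxsplit=2)
    # The optional "*" preserves the original unacceptability marking.
    starred = rest.startswith("* ")
    sentence = rest[2:] if starred else rest
    return {"source": source, "label": int(label),
            "starred": starred, "sentence": sentence}

row = parse_line("c-05 0 * the book was by john written .")
print(row["label"], row["starred"], row["sentence"])
```

Note that the binary label and the star are redundant by design (label 0 rows carry the star); keeping both makes it easy to cross-check the file after any preprocessing.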
================================================ FILE: Chapter03/KantaiBERT.ipynb ================================================ {"nbformat":4,"nbformat_minor":0,"metadata":{"accelerator":"GPU","colab":{"name":"KantaiBERT_2.ipynb","provenance":[],"collapsed_sections":[],"toc_visible":true,"machine_shape":"hm"},"kernelspec":{"display_name":"Python 3","name":"python3"}},"cells":[{"cell_type":"markdown","metadata":{"id":"M1oqh0F6W3ad"},"source":["# How to train a new language model from scratch using Transformers and Tokenizers\n","\n","Copyright 2020, Denis Rothman. Denis Rothman adapted a Hugging Face reference notebook to pretrain a transformer model. The next steps would be to build a larger dataset and to test several transformer models. \n","\n","The Transformer model in this notebook is named ***KantaiBERT***. ***KantaiBERT*** is trained as a RoBERTa Transformer with a DistilBERT architecture. The dataset was compiled from three books by Immanuel Kant downloaded from [Project Gutenberg](https://www.gutenberg.org/). \n","\n","\n","\n","![](https://commons.wikimedia.org/wiki/Kant_gemaelde_1.jpg)\n","\n","***KantaiBERT*** was pretrained as a small model of 84 million parameters with the same number of layers and heads as DistilBERT, i.e., 6 layers, a hidden size of 768, and 12 attention heads. 
***KantaiBERT*** is then fine-tuned for a downstream masked language modeling task.\n","\n","### The original Hugging Face reference and notes:\n","\n","Notebook edition of the original reference blog post ([link](https://huggingface.co/blog/how-to-train)).\n"]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"HOk4iZ9YZvec","executionInfo":{"status":"ok","timestamp":1611319407103,"user_tz":-330,"elapsed":1307,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}},"outputId":"312e6d71-acb6-43e3-a4b1-dcd25f27c5f3"},"source":["#@title Step 1: Loading the Dataset\n","# 1. Load kant.txt using the Colab file manager, or\n","# 2. Download the file from GitHub:\n","!curl -L https://raw.githubusercontent.com/PacktPublishing/Transformers-for-Natural-Language-Processing/master/Chapter03/kant.txt --output \"kant.txt\""],"execution_count":2,"outputs":[{"output_type":"stream","text":[" % Total % Received % Xferd Average Speed Time Time Time Current\n"," Dload Upload Total Spent Left Speed\n","100 10.7M 100 10.7M 0 0 31.0M 0 --:--:-- --:--:-- --:--:-- 30.9M\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"5duRggBRZKvP","executionInfo":{"elapsed":48685,"status":"ok","timestamp":1611302298137,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"52e4d19b-8a7b-412c-83ab-d12a4759e508"},"source":["#@title Step 2: Installing Hugging Face Transformers\n","# We won't need TensorFlow here\n","!pip uninstall -y tensorflow\n","# Install `transformers` from master\n","!pip install git+https://github.com/huggingface/transformers\n","!pip list | grep -E 'transformers|tokenizers'\n","# transformers version at notebook update --- 2.9.1\n","# 
tokenizers version at notebook update --- 0.7.0"],"execution_count":null,"outputs":[{"output_type":"stream","text":["Uninstalling tensorflow-2.4.0:\n"," Successfully uninstalled tensorflow-2.4.0\n","Collecting git+https://github.com/huggingface/transformers\n"," Cloning https://github.com/huggingface/transformers to /tmp/pip-req-build-c75zlcml\n"," Running command git clone -q https://github.com/huggingface/transformers /tmp/pip-req-build-c75zlcml\n"," Installing build dependencies ... \u001b[?25l\u001b[?25hdone\n"," Getting requirements to build wheel ... \u001b[?25l\u001b[?25hdone\n"," Preparing wheel metadata ... \u001b[?25l\u001b[?25hdone\n","Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (4.41.1)\n","Collecting sacremoses\n","\u001b[?25l Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\n","\u001b[K |████████████████████████████████| 890kB 5.7MB/s \n","\u001b[?25hRequirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (3.0.12)\n","Requirement already satisfied: importlib-metadata; python_version < \"3.8\" in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (3.3.0)\n","Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (2.23.0)\n","Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (1.19.5)\n","Requirement already satisfied: dataclasses; python_version < \"3.7\" in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (0.8)\n","Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) (20.8)\n","Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers==4.3.0.dev0) 
(2019.12.20)\n","Collecting tokenizers==0.9.4\n","\u001b[?25l Downloading https://files.pythonhosted.org/packages/0f/1c/e789a8b12e28be5bc1ce2156cf87cb522b379be9cadc7ad8091a4cc107c4/tokenizers-0.9.4-cp36-cp36m-manylinux2010_x86_64.whl (2.9MB)\n","\u001b[K |████████████████████████████████| 2.9MB 18.6MB/s \n","\u001b[?25hRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.3.0.dev0) (1.15.0)\n","Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.3.0.dev0) (7.1.2)\n","Requirement already satisfied: joblib in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.3.0.dev0) (1.0.0)\n","Requirement already satisfied: typing-extensions>=3.6.4; python_version < \"3.8\" in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \"3.8\"->transformers==4.3.0.dev0) (3.7.4.3)\n","Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \"3.8\"->transformers==4.3.0.dev0) (3.4.0)\n","Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (3.0.4)\n","Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (2020.12.5)\n","Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (2.10)\n","Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.3.0.dev0) (1.24.3)\n","Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers==4.3.0.dev0) (2.4.7)\n","Building wheels for collected packages: transformers\n"," Building wheel for transformers (PEP 517) ... 
\u001b[?25l\u001b[?25hdone\n"," Created wheel for transformers: filename=transformers-4.3.0.dev0-cp36-none-any.whl size=1744849 sha256=274a76dfeb1da6a19cf68c3c28455142689a60d1cfdb99f1464d3e4d31cd010d\n"," Stored in directory: /tmp/pip-ephem-wheel-cache-f7yzuk0p/wheels/70/d3/52/b3fa4f8b8ef04167ac62e5bb2accb62ae764db2a378247490e\n","Successfully built transformers\n","Building wheels for collected packages: sacremoses\n"," Building wheel for sacremoses (setup.py) ... \u001b[?25l\u001b[?25hdone\n"," Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893261 sha256=f75b02ce1a1faa3820b27ed3b46d0d8b84e18a5aa12510611c055c2dc7dbc5f2\n"," Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\n","Successfully built sacremoses\n","Installing collected packages: sacremoses, tokenizers, transformers\n","Successfully installed sacremoses-0.0.43 tokenizers-0.9.4 transformers-4.3.0.dev0\n","tokenizers 0.9.4 \n","transformers 4.3.0.dev0 \n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"IMnymRDLe0hi","executionInfo":{"elapsed":2860,"status":"ok","timestamp":1611303247694,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"706de1c8-715a-41e2-bdcf-3caa67125bf8"},"source":["#@title Step 3: Training a Tokenizer\n","%%time \n","from pathlib import Path\n","\n","from tokenizers import ByteLevelBPETokenizer\n","\n","paths = [str(x) for x in Path(\".\").glob(\"**/*.txt\")]\n","# Initialize a tokenizer\n","tokenizer = ByteLevelBPETokenizer()\n","\n","# Customize training\n","tokenizer.train(files=paths, vocab_size=52_000, min_frequency=2, special_tokens=[\n"," \"<s>\",\n"," \"<pad>\",\n"," \"</s>\",\n"," \"<unk>\",\n"," \"<mask>\",\n","])"],"execution_count":null,"outputs":[{"output_type":"stream","text":["CPU times: user 6.04 s, sys: 
449 ms, total: 6.49 s\n","Wall time: 1.76 s\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"nqYKX1XYyRI-","executionInfo":{"elapsed":1506,"status":"ok","timestamp":1611303250245,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"3247a100-2230-4c7f-92e2-41442e76f3b9"},"source":["#@title Step 4: Saving the files to disk\n","import os\n","token_dir = '/content/KantaiBERT'\n","if not os.path.exists(token_dir):\n"," os.makedirs(token_dir)\n","tokenizer.save_model('KantaiBERT')"],"execution_count":null,"outputs":[{"output_type":"execute_result","data":{"text/plain":["['KantaiBERT/vocab.json', 'KantaiBERT/merges.txt']"]},"metadata":{"tags":[]},"execution_count":7}]},{"cell_type":"code","metadata":{"id":"tKVWB8WShT-z"},"source":["#@title Step 5: Loading the Trained Tokenizer Files\n","from tokenizers.implementations import ByteLevelBPETokenizer\n","from tokenizers.processors import BertProcessing\n","\n","tokenizer = ByteLevelBPETokenizer(\n"," \"./KantaiBERT/vocab.json\",\n"," \"./KantaiBERT/merges.txt\",\n",")"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"I9hQqVS_qZWg","executionInfo":{"elapsed":1393,"status":"ok","timestamp":1611303257943,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"ed5a7467-b61e-4210-d2e5-c506edd44268"},"source":["tokenizer.encode(\"The Critique of Pure Reason.\").tokens"],"execution_count":null,"outputs":[{"output_type":"execute_result","data":{"text/plain":["['The', 'ĠCritique', 'Ġof', 'ĠPure', 'ĠReason', 
'.']"]},"metadata":{"tags":[]},"execution_count":9}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"OGjAwZVGrfyS","executionInfo":{"elapsed":1499,"status":"ok","timestamp":1611303260078,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"fa7923d2-939c-485a-a064-fb43966357cc"},"source":["tokenizer.encode(\"The Critique of Pure Reason.\")"],"execution_count":null,"outputs":[{"output_type":"execute_result","data":{"text/plain":["Encoding(num_tokens=6, attributes=[ids, type_ids, tokens, offsets, attention_mask, special_tokens_mask, overflowing])"]},"metadata":{"tags":[]},"execution_count":10}]},{"cell_type":"code","metadata":{"id":"hO5M3vrAhcuj"},"source":["tokenizer._tokenizer.post_processor = BertProcessing(\n"," (\"</s>\", tokenizer.token_to_id(\"</s>\")),\n"," (\"<s>\", tokenizer.token_to_id(\"<s>\")),\n",")\n","tokenizer.enable_truncation(max_length=512)"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"kD140sFjh0LQ","executionInfo":{"elapsed":1546,"status":"ok","timestamp":1611303265026,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"466229fc-5980-4ced-ffb7-0035fba3ff73"},"source":["#@title Step 6: Checking Resource Constraints: GPU and NVIDIA \n","!nvidia-smi"],"execution_count":null,"outputs":[{"output_type":"stream","text":["Fri Jan 22 08:14:23 2021 \n","+-----------------------------------------------------------------------------+\n","| NVIDIA-SMI 460.32.03 Driver Version: 418.67 CUDA Version: 10.1 |\n","|-------------------------------+----------------------+----------------------+\n","| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. 
ECC |\n","| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |\n","| | | MIG M. |\n","|===============================+======================+======================|\n","| 0 Tesla P100-PCIE... Off | 00000000:00:04.0 Off | 0 |\n","| N/A 34C P0 25W / 250W | 0MiB / 16280MiB | 0% Default |\n","| | | ERR! |\n","+-------------------------------+----------------------+----------------------+\n"," \n","+-----------------------------------------------------------------------------+\n","| Processes: |\n","| GPU GI CI PID Type Process name GPU Memory |\n","| ID ID Usage |\n","|=============================================================================|\n","| No running processes found |\n","+-----------------------------------------------------------------------------+\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"VNZZs-r6iKAV","executionInfo":{"elapsed":1562,"status":"ok","timestamp":1611303382926,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"6b3f8b32-4ccd-4661-d58a-74b324766495"},"source":["#@title Checking that PyTorch Sees CUDA\n","import torch\n","torch.cuda.is_available()"],"execution_count":null,"outputs":[{"output_type":"execute_result","data":{"text/plain":["True"]},"metadata":{"tags":[]},"execution_count":14}]},{"cell_type":"code","metadata":{"id":"LTXXutqeDzPi"},"source":["#@title Step 7: Defining the configuration of the Model\n","from transformers import RobertaConfig\n","\n","config = RobertaConfig(\n"," vocab_size=52_000,\n"," max_position_embeddings=514,\n"," num_attention_heads=12,\n"," num_hidden_layers=6,\n"," 
type_vocab_size=1,\n",")"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"5-UsuK9Ps0H7","executionInfo":{"elapsed":1631,"status":"ok","timestamp":1611303394881,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"405400e1-733f-490b-de7b-ae249d95ac01"},"source":["print(config)"],"execution_count":null,"outputs":[{"output_type":"stream","text":["RobertaConfig {\n"," \"attention_probs_dropout_prob\": 0.1,\n"," \"bos_token_id\": 0,\n"," \"eos_token_id\": 2,\n"," \"gradient_checkpointing\": false,\n"," \"hidden_act\": \"gelu\",\n"," \"hidden_dropout_prob\": 0.1,\n"," \"hidden_size\": 768,\n"," \"initializer_range\": 0.02,\n"," \"intermediate_size\": 3072,\n"," \"layer_norm_eps\": 1e-12,\n"," \"max_position_embeddings\": 514,\n"," \"model_type\": \"roberta\",\n"," \"num_attention_heads\": 12,\n"," \"num_hidden_layers\": 6,\n"," \"pad_token_id\": 1,\n"," \"position_embedding_type\": \"absolute\",\n"," \"transformers_version\": \"4.3.0.dev0\",\n"," \"type_vocab_size\": 1,\n"," \"use_cache\": true,\n"," \"vocab_size\": 52000\n","}\n","\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"4keFBUjQFOD1"},"source":["#@title Step 8: Re-creating the Tokenizer in Transformers\n","from transformers import RobertaTokenizer\n","tokenizer = RobertaTokenizer.from_pretrained(\"./KantaiBERT\", max_length=512)"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"BzMqR-dzF4Ro","executionInfo":{"elapsed":4263,"status":"ok","timestamp":1611303404170,"user":{"displayName":"Karan 
Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"e71ab069-0d78-4592-f4cc-158050b47f75"},"source":["#@title Step 9: Initializing a Model From Scratch\n","from transformers import RobertaForMaskedLM\n","\n","model = RobertaForMaskedLM(config=config)\n","print(model)"],"execution_count":null,"outputs":[{"output_type":"stream","text":["RobertaForMaskedLM(\n"," (roberta): RobertaModel(\n"," (embeddings): RobertaEmbeddings(\n"," (word_embeddings): Embedding(52000, 768, padding_idx=1)\n"," (position_embeddings): Embedding(514, 768, padding_idx=1)\n"," (token_type_embeddings): Embedding(1, 768)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," (encoder): RobertaEncoder(\n"," (layer): ModuleList(\n"," (0): RobertaLayer(\n"," (attention): RobertaAttention(\n"," (self): RobertaSelfAttention(\n"," (query): Linear(in_features=768, out_features=768, bias=True)\n"," (key): Linear(in_features=768, out_features=768, bias=True)\n"," (value): Linear(in_features=768, out_features=768, bias=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," (output): RobertaSelfOutput(\n"," (dense): Linear(in_features=768, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (intermediate): RobertaIntermediate(\n"," (dense): Linear(in_features=768, out_features=3072, bias=True)\n"," )\n"," (output): RobertaOutput(\n"," (dense): Linear(in_features=3072, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (1): RobertaLayer(\n"," (attention): RobertaAttention(\n"," (self): RobertaSelfAttention(\n"," (query): Linear(in_features=768, out_features=768, bias=True)\n"," (key): 
Linear(in_features=768, out_features=768, bias=True)\n"," (value): Linear(in_features=768, out_features=768, bias=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," (output): RobertaSelfOutput(\n"," (dense): Linear(in_features=768, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (intermediate): RobertaIntermediate(\n"," (dense): Linear(in_features=768, out_features=3072, bias=True)\n"," )\n"," (output): RobertaOutput(\n"," (dense): Linear(in_features=3072, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (2): RobertaLayer(\n"," (attention): RobertaAttention(\n"," (self): RobertaSelfAttention(\n"," (query): Linear(in_features=768, out_features=768, bias=True)\n"," (key): Linear(in_features=768, out_features=768, bias=True)\n"," (value): Linear(in_features=768, out_features=768, bias=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," (output): RobertaSelfOutput(\n"," (dense): Linear(in_features=768, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (intermediate): RobertaIntermediate(\n"," (dense): Linear(in_features=768, out_features=3072, bias=True)\n"," )\n"," (output): RobertaOutput(\n"," (dense): Linear(in_features=3072, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (3): RobertaLayer(\n"," (attention): RobertaAttention(\n"," (self): RobertaSelfAttention(\n"," (query): Linear(in_features=768, out_features=768, bias=True)\n"," (key): Linear(in_features=768, out_features=768, bias=True)\n"," (value): Linear(in_features=768, out_features=768, bias=True)\n"," (dropout): 
Dropout(p=0.1, inplace=False)\n"," )\n"," (output): RobertaSelfOutput(\n"," (dense): Linear(in_features=768, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (intermediate): RobertaIntermediate(\n"," (dense): Linear(in_features=768, out_features=3072, bias=True)\n"," )\n"," (output): RobertaOutput(\n"," (dense): Linear(in_features=3072, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (4): RobertaLayer(\n"," (attention): RobertaAttention(\n"," (self): RobertaSelfAttention(\n"," (query): Linear(in_features=768, out_features=768, bias=True)\n"," (key): Linear(in_features=768, out_features=768, bias=True)\n"," (value): Linear(in_features=768, out_features=768, bias=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," (output): RobertaSelfOutput(\n"," (dense): Linear(in_features=768, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (intermediate): RobertaIntermediate(\n"," (dense): Linear(in_features=768, out_features=3072, bias=True)\n"," )\n"," (output): RobertaOutput(\n"," (dense): Linear(in_features=3072, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (5): RobertaLayer(\n"," (attention): RobertaAttention(\n"," (self): RobertaSelfAttention(\n"," (query): Linear(in_features=768, out_features=768, bias=True)\n"," (key): Linear(in_features=768, out_features=768, bias=True)\n"," (value): Linear(in_features=768, out_features=768, bias=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," (output): RobertaSelfOutput(\n"," (dense): Linear(in_features=768, out_features=768, bias=True)\n"," 
(LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," (intermediate): RobertaIntermediate(\n"," (dense): Linear(in_features=768, out_features=3072, bias=True)\n"," )\n"," (output): RobertaOutput(\n"," (dense): Linear(in_features=3072, out_features=768, bias=True)\n"," (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (dropout): Dropout(p=0.1, inplace=False)\n"," )\n"," )\n"," )\n"," )\n"," )\n"," (lm_head): RobertaLMHead(\n"," (dense): Linear(in_features=768, out_features=768, bias=True)\n"," (layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n"," (decoder): Linear(in_features=768, out_features=52000, bias=True)\n"," )\n",")\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"jU6JhBSTKiaM","executionInfo":{"elapsed":1417,"status":"ok","timestamp":1611303407295,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"7cefac4a-263c-4785-d91f-f9020bfd3d1c"},"source":["print(model.num_parameters())"],"execution_count":null,"outputs":[{"output_type":"stream","text":["83504416\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"-BXhhe7twTxb","executionInfo":{"elapsed":1327,"status":"ok","timestamp":1611303409350,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"5f78d978-6e80-477e-aebc-3a1c9119b5e9"},"source":["#@title Exploring the Parameters\n","LP=list(model.parameters())\n","lp=len(LP)\n","print(lp)\n","for p in range(0,lp):\n"," print(LP[p])"],"execution_count":null,"outputs":[{"output_type":"stream","text":["106\n","Parameter containing:\n","tensor([[ 
0.0235, 0.0174, -0.0312, ..., -0.0200, 0.0193, 0.0241],\n"," [ 0.0223, 0.0050, -0.0057, ..., 0.0110, -0.0061, 0.0102],\n"," [-0.0245, -0.0372, 0.0108, ..., 0.0088, -0.0083, 0.0045],\n"," ...,\n"," [-0.0384, -0.0139, 0.0199, ..., -0.0005, 0.0123, 0.0251],\n"," [ 0.0307, 0.0179, 0.0046, ..., -0.0197, 0.0076, -0.0035],\n"," [ 0.0019, 0.0276, -0.0056, ..., 0.0491, -0.0172, 0.0045]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0130, -0.0137, 0.0190, ..., -0.0165, -0.0319, 0.0139],\n"," [-0.0254, 0.0061, 0.0060, ..., 0.0178, 0.0224, 0.0162],\n"," [-0.0003, 0.0218, -0.0115, ..., -0.0138, -0.0128, 0.0331],\n"," ...,\n"," [-0.0115, 0.0210, 0.0268, ..., -0.0152, 0.0361, -0.0047],\n"," [ 0.0272, 0.0065, 0.0166, ..., 0.0208, -0.0169, -0.0053],\n"," [ 0.0158, 0.0003, 0.0151, ..., -0.0129, 0.0220, -0.0140]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[ 7.1800e-03, 6.6437e-03, 8.4003e-03, -9.4242e-03, 3.5971e-02,\n"," 1.5493e-02, 8.6217e-03, -2.5297e-02, -3.4352e-02, -2.7498e-04,\n"," 3.5180e-03, 2.6678e-02, 8.6635e-03, -2.7617e-02, 1.0993e-02,\n"," 8.0117e-03, -1.4683e-02, -4.3267e-03, -2.5300e-02, -1.5936e-02,\n"," 1.4094e-02, -1.0912e-02, -1.1442e-02, 3.0577e-03, -1.4531e-02,\n"," -1.6202e-04, -3.8920e-03, -3.7224e-02, 2.1012e-02, -4.2987e-03,\n"," -1.0250e-02, -4.4596e-02, 9.0525e-03, 3.5725e-02, -2.5446e-02,\n"," -1.5833e-02, -1.2881e-02, 1.3866e-02, 1.3644e-02, 1.7893e-05,\n"," 1.5038e-02, 5.3411e-03, -3.6112e-03, -3.5338e-03, 9.4914e-03,\n"," -4.9824e-04, -5.1193e-03, -1.0588e-02, 2.9083e-02, 3.1769e-02,\n"," 5.6870e-04, 1.6663e-03, 2.3251e-02, 1.7677e-02, -6.0966e-02,\n"," 2.4458e-02, -9.8698e-03, 8.8821e-03, 2.2937e-03, -2.7309e-02,\n"," -6.4727e-03, -1.2389e-02, 2.5948e-02, 2.4857e-02, 3.5420e-02,\n"," -1.8719e-02, -1.3693e-03, -3.0626e-02, 2.7882e-03, -5.6030e-03,\n"," 3.6367e-02, 4.4579e-03, -2.8664e-03, -8.1722e-03, 2.7467e-02,\n"," 2.4763e-02, -1.5073e-02, 1.6423e-02, 1.3034e-02, -2.8786e-02,\n"," -1.8404e-02, 2.0704e-02, 
7.7605e-04, -3.0608e-02, -1.2811e-02,\n"," 1.7842e-02, 4.0206e-02, 1.2923e-02, 4.2755e-02, -2.9865e-02,\n"," -8.5164e-03, -1.5878e-02, -2.4676e-02, -5.8619e-03, 1.0806e-02,\n"," -2.4007e-02, -1.7718e-02, 1.9903e-02, 8.6628e-03, 3.6344e-03,\n"," 8.1299e-03, 2.5190e-02, 6.9107e-03, -4.3021e-02, 1.5741e-02,\n"," 5.8291e-03, -8.6341e-03, -4.1600e-03, -6.1461e-03, -8.9474e-03,\n"," 3.9467e-03, 1.9342e-02, 1.6835e-02, -1.1109e-02, -3.2239e-03,\n"," -9.3587e-05, 2.1938e-02, -1.9343e-02, 1.8259e-02, 8.5760e-03,\n"," -6.0120e-03, 2.0020e-02, -1.2867e-03, 3.0612e-02, -2.8033e-02,\n"," 2.3411e-02, -2.7851e-02, 8.3895e-03, 2.0522e-02, 2.8449e-02,\n"," 5.3295e-03, 1.7460e-03, -1.0284e-02, 3.9836e-03, -2.9588e-03,\n"," -1.7243e-02, 1.5264e-02, -2.8806e-03, 7.4932e-03, -1.2714e-03,\n"," -3.7502e-02, -3.0640e-02, -1.9682e-02, 4.5279e-03, -1.8760e-02,\n"," 1.1877e-02, -2.1112e-03, -7.6996e-03, -1.1017e-02, -1.8120e-02,\n"," 1.1911e-02, 1.4793e-02, -3.6629e-02, 1.6278e-02, -2.3724e-02,\n"," -1.7545e-02, 1.5860e-02, -1.4429e-02, -1.6339e-02, 2.0127e-03,\n"," -2.2843e-03, 1.4623e-02, 1.4657e-02, 4.2222e-02, 2.1177e-03,\n"," -3.4719e-03, 2.5870e-02, -2.1785e-03, 3.1395e-03, 6.6356e-03,\n"," -5.9466e-03, -5.7608e-02, 2.9915e-02, 2.1435e-02, 3.2267e-02,\n"," -3.7518e-02, -2.3894e-03, 1.3420e-02, -2.4054e-03, 1.4911e-02,\n"," -1.3702e-04, -2.7453e-02, 1.8495e-02, 1.4016e-02, -1.8197e-03,\n"," 9.7095e-03, -5.1503e-03, 7.9470e-04, 1.4856e-02, -1.3550e-02,\n"," -1.3359e-02, -1.8834e-02, -7.5882e-03, 2.0986e-02, 2.5334e-02,\n"," 8.9823e-04, 3.4695e-02, 2.2731e-02, -1.7828e-02, 7.3283e-03,\n"," 3.2347e-03, 1.8042e-02, -1.3865e-02, 7.8857e-03, -3.2168e-02,\n"," -9.6327e-03, 2.2697e-02, -6.7531e-03, -2.4187e-02, -7.3901e-03,\n"," 1.6666e-02, 3.7033e-03, 2.2159e-02, 2.1215e-02, -1.7350e-02,\n"," -3.5021e-02, 4.0338e-02, -1.4414e-03, -2.7513e-02, 1.9779e-02,\n"," -2.7719e-02, 1.4489e-02, 4.3596e-03, 1.2859e-02, 9.3213e-03,\n"," 2.0891e-02, 1.0693e-02, -5.1071e-03, 3.1345e-02, -3.0417e-02,\n"," 
-3.4125e-02, 2.4389e-02, 2.0400e-02, -1.6777e-02, 4.3065e-02,\n"," 5.6042e-03, -7.2931e-03, -4.3282e-03, 1.6478e-02, -3.6286e-02,\n"," -1.4050e-02, 1.1615e-02, -4.2972e-03, -8.1791e-03, 8.1628e-03,\n"," -2.4173e-02, -4.4340e-03, -8.4001e-03, 2.1200e-03, -1.2033e-02,\n"," -2.3261e-02, -1.4815e-02, -2.6957e-02, 9.9435e-03, -2.8107e-02,\n"," -2.3369e-02, -1.0778e-02, -1.4185e-02, 6.0750e-03, -1.3494e-02,\n"," 2.7330e-02, -5.6782e-03, 7.8278e-04, 6.3407e-03, 1.3132e-02,\n"," 1.9369e-02, -3.2497e-02, -4.0580e-02, -3.6554e-02, 2.1006e-02,\n"," 1.5128e-02, -2.2121e-02, -3.2965e-02, -2.3447e-02, 1.2934e-02,\n"," -8.9827e-03, -1.4176e-02, -1.0781e-02, 6.4699e-03, 2.8518e-02,\n"," 6.9459e-03, 1.1522e-03, -1.3720e-03, 4.2082e-02, 3.0859e-03,\n"," -9.0549e-03, 1.1470e-02, -1.1126e-02, 3.0940e-02, -4.2342e-02,\n"," -1.3248e-02, -6.9904e-03, 6.8190e-03, 2.5913e-03, 7.9832e-03,\n"," -1.1913e-02, -2.8337e-02, 1.4619e-02, -1.9521e-02, 1.2593e-02,\n"," 1.2898e-02, 2.1613e-02, -1.5045e-02, 3.4067e-02, 2.7984e-02,\n"," 1.2481e-02, -3.2347e-02, 3.8949e-02, 8.9595e-03, 7.9586e-03,\n"," -3.1714e-02, 4.2258e-03, -1.3134e-02, 5.1858e-03, 4.8415e-03,\n"," -4.5262e-02, -2.8169e-02, -2.4474e-02, -1.7195e-02, 2.9952e-02,\n"," -1.7173e-02, 9.7922e-03, 5.6517e-03, 7.5203e-03, 4.0959e-03,\n"," 1.0813e-02, 4.3824e-02, 1.1238e-02, 3.3847e-02, -6.5930e-04,\n"," 2.1624e-02, 2.5236e-02, -3.8943e-03, -9.1064e-03, -1.1540e-02,\n"," -3.8315e-02, 2.8085e-02, -1.9670e-02, 7.4361e-03, -1.1309e-02,\n"," 1.9085e-03, -1.6417e-03, 2.2166e-02, -7.0164e-04, -2.6895e-02,\n"," 1.2807e-02, -3.3783e-02, 2.7430e-02, -1.8868e-02, -3.3606e-02,\n"," 1.3179e-02, 5.5675e-03, -2.9198e-03, -7.6581e-03, -2.1437e-02,\n"," -2.1133e-02, -2.1569e-02, 1.4381e-02, 2.7465e-03, -1.3964e-02,\n"," -1.4823e-03, -6.7077e-03, 1.2270e-02, -3.1715e-02, 3.0172e-02,\n"," -2.6980e-02, 4.9205e-02, -1.1472e-02, -1.0497e-02, -9.2518e-03,\n"," -4.7793e-04, -1.6612e-02, -5.6121e-03, -1.0638e-04, -2.1223e-02,\n"," -9.3877e-03, 1.7303e-02, 
-9.3583e-03, -4.3980e-02, -4.8787e-03,\n"," 3.8577e-03, -8.0367e-04, -9.6154e-04, -1.5294e-02, -4.7520e-04,\n"," -3.0764e-02, -4.4183e-03, -5.6546e-02, -1.0320e-02, 6.8146e-03,\n"," -1.6257e-02, 3.5170e-03, 1.2217e-02, -1.1546e-03, 2.4336e-02,\n"," 3.0590e-02, -3.3474e-03, -2.1434e-02, -2.4652e-02, 4.6145e-02,\n"," -5.3906e-03, -2.3682e-03, 5.4975e-03, -1.7060e-02, -2.0117e-02,\n"," -3.2636e-02, -1.2466e-02, -2.4518e-02, 1.4289e-02, -2.3107e-02,\n"," 3.7563e-03, 1.0498e-03, -1.4731e-04, 1.6694e-02, -8.0411e-03,\n"," 1.4120e-03, -3.1152e-03, -3.8396e-02, 3.1607e-02, -1.0289e-02,\n"," 1.8154e-02, 1.1437e-03, -2.7626e-02, 2.1337e-03, -1.8168e-02,\n"," -3.5218e-02, 2.2459e-02, 7.8964e-03, -2.5786e-03, -9.5183e-03,\n"," 7.1241e-03, 2.0980e-03, 9.6836e-03, -1.6857e-03, 1.4061e-02,\n"," -1.2510e-02, 2.2653e-02, -8.3410e-03, 2.8464e-02, -1.3376e-02,\n"," -1.5460e-02, 4.2510e-02, 2.9187e-02, -1.7961e-02, -3.4083e-03,\n"," -1.5833e-02, 9.7975e-03, 6.0228e-03, 1.6737e-02, 4.6077e-02,\n"," 4.7919e-02, -1.7183e-02, -3.9046e-02, 4.2153e-03, -5.8682e-03,\n"," -7.6091e-03, 6.1273e-03, -5.4660e-03, -8.3213e-03, 3.4065e-02,\n"," -1.3743e-03, -3.0128e-02, -5.6012e-03, 1.8614e-02, -1.4121e-02,\n"," 7.2251e-03, 1.9277e-02, 9.1925e-03, -1.0967e-02, -1.4405e-02,\n"," 1.1874e-02, 3.7143e-02, 4.0905e-03, -3.5487e-02, 2.1459e-02,\n"," 1.2339e-02, -2.5228e-02, 1.3806e-03, -7.1485e-03, -4.3867e-04,\n"," -4.0890e-02, -7.7639e-03, 1.8846e-02, -7.9675e-03, 1.9798e-02,\n"," 2.6019e-02, -2.9910e-02, 3.7135e-02, -2.3721e-03, -1.3041e-02,\n"," -1.3590e-02, -2.1053e-02, -4.2212e-02, -2.0449e-02, -1.6761e-03,\n"," 1.3793e-02, 5.0163e-03, 7.2525e-03, -1.0584e-03, 1.7911e-02,\n"," -6.1241e-03, 3.6156e-02, -9.7795e-03, 1.1575e-02, -1.2393e-02,\n"," -2.1529e-02, 3.6920e-02, 5.4768e-03, -2.5465e-03, -3.5564e-02,\n"," -1.8852e-02, -1.9270e-02, -2.3598e-02, 1.6461e-02, -8.5044e-03,\n"," -8.1342e-02, 1.9946e-02, 2.3101e-02, 1.1608e-02, -2.7202e-02,\n"," -2.1214e-03, 1.8512e-02, -2.2433e-02, 5.2665e-02, 
1.4524e-02,\n"," -2.7767e-02, 2.0917e-02, -2.4531e-02, 2.8340e-04, 7.4590e-03,\n"," 1.3207e-02, -4.4964e-04, 1.8489e-02, -8.1855e-03, -2.6571e-02,\n"," 6.0091e-03, -1.5137e-02, -1.0845e-02, 1.8406e-02, -8.6995e-03,\n"," 4.6326e-02, -3.4162e-02, -4.5423e-03, -1.1695e-02, -1.6018e-03,\n"," 4.5359e-04, 1.3775e-02, -3.0665e-02, -3.7231e-03, 1.9538e-02,\n"," 1.8332e-02, 1.5294e-02, -2.2645e-02, -1.7966e-04, -5.6845e-03,\n"," 2.4474e-02, 1.2335e-02, -2.8537e-02, -3.3811e-02, -3.9477e-02,\n"," 1.5555e-02, 2.4807e-02, 1.1926e-02, 1.3630e-02, -2.4158e-03,\n"," 5.2492e-03, -3.5830e-03, 1.6062e-02, 8.3222e-03, 4.5684e-03,\n"," -7.3681e-03, 1.7268e-02, -1.6564e-02, -2.3695e-02, 1.8832e-02,\n"," 1.2441e-02, -2.6820e-02, -1.8805e-02, -1.5253e-02, -6.4328e-03,\n"," -1.8041e-02, -3.2997e-02, 1.3746e-02, -3.5377e-02, -1.6488e-02,\n"," 3.8501e-02, 9.5742e-03, -1.4900e-02, -1.6956e-02, -3.1345e-02,\n"," -1.0592e-02, 1.2087e-02, 4.2428e-03, 2.2683e-02, 9.1621e-03,\n"," 1.9223e-02, -2.2042e-02, 9.1639e-03, -4.3950e-02, 8.7145e-03,\n"," 2.8953e-03, 5.7122e-03, -2.4340e-02, -1.0640e-02, -1.1329e-02,\n"," -2.6244e-02, 1.6505e-02, -3.3996e-02, 1.9131e-02, 1.0224e-02,\n"," 1.4776e-02, 2.7232e-02, -1.3224e-02, -9.5900e-03, -9.4386e-03,\n"," -2.3307e-02, -2.9251e-02, -2.1324e-02, -8.7544e-03, -3.4134e-03,\n"," 1.0803e-02, 2.8762e-03, 1.4169e-02, -1.3228e-02, -1.1342e-02,\n"," -2.9762e-02, 3.8964e-02, 2.5084e-02, -5.2755e-03, 2.5368e-02,\n"," 1.5550e-02, 2.3373e-02, 4.4992e-02, 3.5252e-02, 5.1078e-03,\n"," -3.5071e-02, -1.6495e-02, -2.0655e-03, -2.2318e-02, 4.7724e-03,\n"," 3.0932e-02, -2.7194e-03, 6.9039e-03, -1.2915e-02, 1.3947e-02,\n"," 2.8844e-02, 3.2517e-03, 4.6779e-03, 7.0392e-03, -1.3077e-03,\n"," -1.7592e-02, -4.7972e-03, -1.7324e-02, -6.2396e-03, 2.2592e-02,\n"," -1.3515e-02, -8.0224e-03, 2.1195e-02, 2.2268e-02, 2.5593e-02,\n"," -1.9392e-02, 2.1210e-02, -6.4903e-04, -6.2049e-03, 4.3366e-03,\n"," -1.2365e-02, 9.8694e-03, 2.6114e-02, 3.4560e-02, -1.6659e-02,\n"," 2.1352e-02, 
-2.9365e-02, -3.3745e-03, 6.6747e-03, 1.3251e-02,\n"," -9.8489e-03, 1.8596e-02, 3.2361e-02, 7.7174e-03, -1.2383e-03,\n"," 2.5911e-02, 2.4750e-02, 5.1987e-03, 1.3954e-02, -1.5113e-02,\n"," 7.1089e-02, 8.0755e-05, -8.1151e-03, 1.2588e-02, 9.8257e-03,\n"," 3.8920e-02, -2.1225e-02, 2.4114e-02, -1.6625e-02, -3.0056e-02,\n"," -2.7325e-03, 7.4526e-03, -1.4002e-02, 3.4950e-03, -8.6745e-03,\n"," 3.0063e-03, -2.1799e-02, -1.4129e-02, 2.9121e-04, -4.3018e-02,\n"," -4.4798e-02, -1.8633e-02, -1.2476e-02, -8.5260e-03, 2.1731e-03,\n"," 7.5067e-03, -2.0826e-02, -3.4043e-02, 3.2342e-03, 1.9906e-02,\n"," 1.8168e-02, -2.3600e-02, -2.7241e-03, -4.9381e-02, -7.6955e-03,\n"," -5.1531e-03, 3.1543e-02, -1.9595e-03, 3.5362e-02, 3.4881e-03,\n"," -9.6533e-03, 1.2043e-02, 3.6402e-02, 7.7131e-03, 1.1696e-02,\n"," 2.5230e-02, -5.0558e-03, 1.0681e-02, 1.3198e-02, 1.5028e-02,\n"," -3.5566e-02, -2.5206e-02, -9.1994e-03, -3.8904e-02, 1.6052e-02,\n"," 9.3387e-03, 3.6619e-02, -1.5902e-02, -6.2100e-03, -2.9235e-03,\n"," -7.6509e-03, -3.5802e-02, -1.0864e-02, 7.5553e-03, 2.3614e-02,\n"," 3.2622e-03, 1.5450e-02, -7.8131e-03, 1.4952e-02, 2.0042e-02,\n"," -2.0912e-02, 2.9369e-02, -1.1444e-02, -1.8924e-03, 1.9401e-02,\n"," -6.1184e-03, 2.1170e-04, -6.4930e-03, -7.0237e-04, 3.6012e-02,\n"," 2.1304e-02, -1.2476e-02, 1.8241e-02]], requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","[... output truncated: the cell prints every trainable parameter of the model as repeated 'Parameter containing:' blocks — zero-initialized bias vectors, ones-initialized LayerNorm weight vectors, and small random weight matrices, all with requires_grad=True ...]\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," 
requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0287, -0.0230, 0.0014, ..., 0.0075, 0.0079, 0.0613],\n"," [ 0.0374, 0.0188, -0.0121, ..., 0.0040, 0.0162, -0.0196],\n"," [ 0.0042, -0.0110, -0.0315, ..., -0.0221, -0.0409, 0.0357],\n"," ...,\n"," [-0.0087, -0.0071, 0.0022, ..., 0.0310, 0.0067, 0.0144],\n"," [ 0.0077, 0.0096, -0.0059, ..., -0.0267, 0.0289, -0.0156],\n"," [ 0.0087, -0.0253, -0.0012, ..., -0.0169, -0.0123, -0.0010]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0215, -0.0247, -0.0500, ..., 0.0362, -0.0077, 0.0157],\n"," [ 0.0178, -0.0209, 0.0173, ..., 0.0163, -0.0242, 0.0330],\n"," [-0.0260, 0.0015, -0.0006, ..., 0.0037, -0.0195, 0.0091],\n"," 
...,\n"," [ 0.0184, 0.0291, 0.0384, ..., -0.0104, 0.0043, 0.0370],\n"," [-0.0538, 0.0278, 0.0242, ..., -0.0162, -0.0008, -0.0071],\n"," [ 0.0257, 0.0098, 0.0103, ..., -0.0066, -0.0165, 0.0016]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[ 0.0114, 0.0074, 0.0016, ..., -0.0198, -0.0020, -0.0096],\n"," [ 0.0078, 0.0069, 0.0396, ..., 0.0034, -0.0085, -0.0037],\n"," [-0.0195, -0.0269, -0.0479, ..., -0.0211, -0.0026, -0.0250],\n"," ...,\n"," [ 0.0057, 0.0213, -0.0129, ..., 0.0020, 0.0266, 0.0101],\n"," [-0.0033, 0.0155, 0.0236, ..., -0.0229, -0.0166, -0.0096],\n"," [-0.0014, 0.0099, 0.0002, ..., 0.0407, -0.0093, -0.0057]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., ..., 0., 0., 0.], requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0133, -0.0316, 0.0196, ..., -0.0108, 0.0064, 0.0170],\n"," [ 0.0272, -0.0227, 0.0144, ..., -0.0309, 0.0075, -0.0324],\n"," [-0.0131, -0.0101, -0.0131, ..., -0.0107, 0.0268, 0.0084],\n"," ...,\n"," [-0.0045, -0.0154, 0.0094, ..., -0.0247, -0.0337, -0.0311],\n"," [-0.0163, 0.0192, 0.0021, ..., 0.0031, -0.0060, -0.0188],\n"," [ 0.0039, 0.0330, -0.0208, ..., -0.0112, -0.0151, -0.0054]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0110, 0.0115, -0.0081, ..., -0.0104, -0.0163, 0.0067],\n"," [-0.0065, -0.0073, 0.0089, ..., -0.0405, 0.0007, 0.0241],\n"," [ 0.0008, 0.0435, -0.0219, ..., -0.0030, 0.0032, 0.0166],\n"," ...,\n"," [-0.0195, -0.0130, -0.0039, ..., -0.0105, -0.0171, 0.0047],\n"," [ 0.0145, 0.0111, 0.0021, ..., -0.0247, -0.0031, 0.0089],\n"," [ 0.0348, -0.0068, 0.0093, ..., 0.0129, 0.0171, 0.0110]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0.,  [... repetitive parameter printout truncated: alternating zero/one bias and LayerNorm tensors and weight matrices, each with requires_grad=True ...]  0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[ 0.0104, -0.0175, -0.0179, ..., 0.0039, -0.0230, 0.0308],\n"," [ 0.0036, -0.0209, -0.0021, ..., -0.0294, -0.0010, -0.0134],\n"," [ 0.0043, -0.0066, -0.0020, ..., 0.0302, -0.0107, 0.0294],\n"," ...,\n"," [ 0.0050, 0.0231, 0.0165, ..., 0.0235, 0.0017, 0.0008],\n"," [-0.0062, -0.0152, 0.0197, ..., -0.0150, 0.0227, -0.0002],\n"," [-0.0078, 0.0004, -0.0485, ..., -0.0065, 0.0180, -0.0047]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[ 0.0307, -0.0206, 0.0074, ..., -0.0162, -0.0204, -0.0239],\n"," [ 0.0042, 0.0065, 0.0052, ..., -0.0414, -0.0222, 0.0336],\n"," [ 0.0133, 0.0100, -0.0241, ..., 0.0120, 0.0076, 0.0027],\n"," ...,\n"," [-0.0333, -0.0121, 0.0306, ..., -0.0032, -0.0266, -0.0231],\n"," [-0.0119, 0.0303, -0.0076, ..., 0.0027, 0.0146, -0.0204],\n"," [ 0.0218, -0.0079, -0.0013, ..., 0.0240, -0.0374, 0.0312]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., ..., 0., 0., 0.], requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0322, -0.0077, 0.0120, ..., 0.0150, -0.0179, 0.0023],\n"," [ 0.0057, -0.0142, -0.0065, ..., -0.0118, 0.0151, 0.0168],\n"," [-0.0048, 0.0125, 0.0374, ..., 0.0152, 0.0463, 0.0009],\n"," ...,\n"," [ 0.0073, 
0.0066, 0.0022, ..., -0.0218, 0.0131, 0.0258],\n"," [-0.0050, 0.0442, -0.0028, ..., -0.0188, 0.0114, -0.0279],\n"," [-0.0033, 0.0115, -0.0173, ..., 0.0038, -0.0212, 0.0098]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0242, 0.0441, -0.0098, ..., -0.0031, -0.0083, 0.0330],\n"," [-0.0369, 0.0118, 0.0001, ..., 0.0003, -0.0220, 0.0033],\n"," [-0.0065, 0.0101, 0.0038, ..., 0.0188, -0.0076, 0.0088],\n"," ...,\n"," [ 0.0006, 0.0281, 0.0263, ..., -0.0092, 0.0023, 0.0010],\n"," [-0.0148, 0.0130, 0.0021, ..., -0.0130, -0.0136, 0.0012],\n"," [-0.0165, 0.0078, -0.0509, ..., -0.0126, 0.0186, 0.0120]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0259, -0.0256, 0.0229, ..., 0.0480, -0.0087, 0.0039],\n"," [ 0.0022, -0.0142, 0.0378, 
..., 0.0080, -0.0094, 0.0031],\n"," [... parameter printout truncated: the full output repeats the same pattern for every encoder layer — dense weight matrices of small random values (requires_grad=True), bias vectors initialized to zeros, and LayerNorm weights initialized to ones with zero biases ...]\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0150, 0.0113, 0.0144, ..., 0.0170, -0.0096, 0.0191],\n"," [-0.0004, 0.0008, -0.0175, ..., -0.0170, 0.0112, -0.0070],\n"," [ 0.0299, 0.0273, -0.0075, ..., -0.0206, -0.0012, -0.0053],\n"," ...,\n"," [-0.0252, 0.0047, 0.0243, ..., 0.0028, -0.0012, 0.0110],\n"," [-0.0212, -0.0151, 0.0219, ..., -0.0139, 0.0290, 0.0310],\n"," [-0.0261, 0.0195, 0.0263, ..., 0.0245, -0.0040, 0.0331]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 
1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0167, -0.0493, -0.0141, ..., -0.0422, 0.0203, 0.0107],\n"," [-0.0437, -0.0005, 0.0243, ..., -0.0028, 0.0221, -0.0080],\n"," [ 0.0308, -0.0203, 0.0144, ..., 0.0223, 0.0124, 0.0252],\n"," ...,\n"," [ 0.0112, 0.0005, -0.0042, ..., 0.0042, 0.0232, 0.0172],\n"," [-0.0113, 0.0347, -0.0536, ..., 0.0057, 0.0190, -0.0136],\n"," [-0.0060, 0.0082, 0.0432, ..., -0.0228, 0.0417, 0.0247]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., ..., 0., 0., 0.], requires_grad=True)\n","Parameter containing:\n","tensor([[-0.0105, -0.0253, -0.0303, ..., 0.0160, -0.0179, 0.0256],\n"," [-0.0052, 0.0164, 0.0133, ..., -0.0161, 0.0213, -0.0034],\n"," [ 0.0360, -0.0218, 0.0057, ..., 0.0125, 0.0144, -0.0003],\n"," ...,\n"," [-0.0088, 0.0230, 0.0231, ..., 0.0153, -0.0139, -0.0112],\n"," [ 0.0244, -0.0448, -0.0341, ..., -0.0271, -0.0140, -0.0010],\n"," [-0.0094, 0.0033, -0.0191, ..., -0.0193, -0.0186, -0.0260]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., ..., 0., 0., 0.], requires_grad=True)\n","Parameter containing:\n","tensor([[ 0.0208, -0.0084, 0.0142, ..., 0.0035, -0.0125, -0.0398],\n"," [-0.0025, 0.0109, 0.0187, ..., 0.0066, 0.0255, -0.0056],\n"," [ 0.0407, 0.0072, -0.0126, ..., -0.0236, -0.0165, -0.0047],\n"," ...,\n"," [-0.0352, 0.0254, -0.0119, ..., 0.0193, 0.0495, -0.0062],\n"," [-0.0097, -0.0344, 0.0108, ..., 0.0191, -0.0036, 0.0204],\n"," [ 0.0050, 0.0167, 0.0047, ..., 0.0294, 0.0066, -0.0200]],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," requires_grad=True)\n","Parameter containing:\n","tensor([1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 
1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,\n"," 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.], requires_grad=True)\n","Parameter containing:\n","tensor([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 
0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,\n"," 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],\n"," 
requires_grad=True)\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"Ej82kG6K3akQ","executionInfo":{"elapsed":1596,"status":"ok","timestamp":1611303413503,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"09db7a80-a3c2-4cfb-9a80-dd172a1bffa1"},"source":["#@title Counting the parameters\n","np=0\n","for p in range(0,lp):#number of tensors\n"," PL2=True\n"," try:\n"," L2=len(LP[p][0]) #check if 2D\n"," except:\n"," L2=1 #not 2D but 1D\n"," PL2=False\n"," L1=len(LP[p]) \n"," L3=L1*L2\n"," np+=L3 # number of parameters per tensor\n"," if PL2==True:\n"," print(p,L1,L2,L3) # displaying the sizes of the parameters\n"," if PL2==False:\n"," print(p,L1,L3) # displaying the sizes of the parameters\n","\n","print(np) # total number of parameters"],"execution_count":null,"outputs":[{"output_type":"stream","text":["0 52000 768 39936000\n","1 514 768 394752\n","2 1 768 768\n","3 768 768\n","4 768 768\n","5 768 768 589824\n","6 768 768\n","7 768 768 589824\n","8 768 768\n","9 768 768 589824\n","10 768 768\n","11 768 768 589824\n","12 768 768\n","13 768 768\n","14 768 768\n","15 3072 768 2359296\n","16 3072 3072\n","17 768 3072 2359296\n","18 768 768\n","19 768 768\n","20 768 768\n","21 768 768 589824\n","22 768 768\n","23 768 768 589824\n","24 768 768\n","25 768 768 589824\n","26 768 768\n","27 768 768 589824\n","28 768 768\n","29 768 768\n","30 768 768\n","31 3072 768 2359296\n","32 3072 3072\n","33 768 3072 2359296\n","34 768 768\n","35 768 768\n","36 768 768\n","37 768 768 589824\n","38 768 768\n","39 768 768 589824\n","40 768 768\n","41 768 768 589824\n","42 768 768\n","43 768 768 589824\n","44 768 768\n","45 768 768\n","46 768 768\n","47 3072 768 2359296\n","48 3072 3072\n","49 768 3072 2359296\n","50 768 768\n","51 768 768\n","52 768 768\n","53 768 768 589824\n","54 768 
768\n","55 768 768 589824\n","56 768 768\n","57 768 768 589824\n","58 768 768\n","59 768 768 589824\n","60 768 768\n","61 768 768\n","62 768 768\n","63 3072 768 2359296\n","64 3072 3072\n","65 768 3072 2359296\n","66 768 768\n","67 768 768\n","68 768 768\n","69 768 768 589824\n","70 768 768\n","71 768 768 589824\n","72 768 768\n","73 768 768 589824\n","74 768 768\n","75 768 768 589824\n","76 768 768\n","77 768 768\n","78 768 768\n","79 3072 768 2359296\n","80 3072 3072\n","81 768 3072 2359296\n","82 768 768\n","83 768 768\n","84 768 768\n","85 768 768 589824\n","86 768 768\n","87 768 768 589824\n","88 768 768\n","89 768 768 589824\n","90 768 768\n","91 768 768 589824\n","92 768 768\n","93 768 768\n","94 768 768\n","95 3072 768 2359296\n","96 3072 3072\n","97 768 3072 2359296\n","98 768 768\n","99 768 768\n","100 768 768\n","101 52000 52000\n","102 768 768 589824\n","103 768 768\n","104 768 768\n","105 768 768\n","83504416\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"GlvP_A-THEEl","executionInfo":{"elapsed":22936,"status":"ok","timestamp":1611303439451,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"ce117d0d-56d0-473f-eb4e-9efffc7b25dc"},"source":["#@title Step 10: Building the Dataset\n","%%time\n","from transformers import LineByLineTextDataset\n","\n","dataset = LineByLineTextDataset(\n"," tokenizer=tokenizer,\n"," file_path=\"./kant.txt\",\n"," block_size=128,\n",")"],"execution_count":null,"outputs":[{"output_type":"stream","text":["/usr/local/lib/python3.6/dist-packages/transformers/data/datasets/language_modeling.py:128: FutureWarning: This dataset will be removed from the library soon, preprocessing should be handled with the 🤗 Datasets library. 
You can have a look at this example script for pointers: https://github.com/huggingface/transformers/blob/master/examples/language-modeling/run_mlm.py\n"," FutureWarning,\n"],"name":"stderr"},{"output_type":"stream","text":["CPU times: user 20.2 s, sys: 661 ms, total: 20.9 s\n","Wall time: 20.9 s\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"zTgWPa9Dipk2"},"source":["#@title Step 11: Defining a Data Collator\n","from transformers import DataCollatorForLanguageModeling\n","\n","data_collator = DataCollatorForLanguageModeling(\n"," tokenizer=tokenizer, mlm=True, mlm_probability=0.15\n",")"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"id":"YpvnFFmZJD-N"},"source":["#@title Step 12: Initializing the Trainer\n","from transformers import Trainer, TrainingArguments\n","\n","training_args = TrainingArguments(\n"," output_dir=\"./KantaiBERT\",\n"," overwrite_output_dir=True,\n"," num_train_epochs=1,\n"," per_device_train_batch_size=64,\n"," save_steps=10_000,\n"," save_total_limit=2,\n",")\n","\n","trainer = Trainer(\n"," model=model,\n"," args=training_args,\n"," data_collator=data_collator,\n"," train_dataset=dataset,\n",")"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/","height":285},"id":"VmaHZXzmkNtJ","executionInfo":{"elapsed":351691,"status":"ok","timestamp":1611303814910,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"998acefe-df4b-4d07-8059-d25b323587e1"},"source":["#@title Step 13: Pre-training the Model\n","%%time\n","trainer.train()"],"execution_count":null,"outputs":[{"output_type":"display_data","data":{"text/html":["\n","
<div>\n",
"  <progress value='2672' max='2672' style='width:300px; height:20px; vertical-align: middle;'></progress>\n",
"  [2672/2672 05:50, Epoch 1/1]\n",
"</div>\n",
"<table border=\"1\" class=\"dataframe\">\n",
"  <thead>\n",
"    <tr>\n",
"      <th>Step</th>\n",
"      <th>Training Loss</th>\n",
"    </tr>\n",
"  </thead>\n",
"  <tbody>\n",
"    <tr>\n",
"      <td>500</td>\n",
"      <td>4.755200</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <td>1000</td>\n",
"      <td>4.046900</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <td>1500</td>\n",
"      <td>3.770500</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <td>2000</td>\n",
"      <td>3.549800</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <td>2500</td>\n",
"      <td>3.431600</td>\n",
"    </tr>\n",
"  </tbody>\n",
"</table>\n"],
"text/plain":["<IPython.core.display.HTML object>"]},"metadata":{"tags":[]}},{"output_type":"stream","text":["CPU times: user 4min 14s, sys: 1min 37s, total: 5min 51s\n","Wall time: 5min 50s\n"],"name":"stdout"},{"output_type":"execute_result","data":{"text/plain":["TrainOutput(global_step=2672, training_loss=3.8793241306693256, metrics={'train_runtime': 350.6061, 'train_samples_per_second': 7.621, 'total_flos': 1689347110470912, 'epoch': 1.0})"]},"metadata":{"tags":[]},"execution_count":25}]},{"cell_type":"code","metadata":{"id":"QDNgPls7_l13"},"source":["#@title Step 14: Saving the Final Model (+ tokenizer + config) to disk\n","trainer.save_model(\"./KantaiBERT\")"],"execution_count":null,"outputs":[]},{"cell_type":"code","metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"ltXgXyCbAJLY","executionInfo":{"elapsed":6144,"status":"ok","timestamp":1611304118693,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"},"user_tz":-330},"outputId":"36ddf0da-9b07-4f97-b8ad-834827e4bc25"},"source":["#@title Step 15: Language Modeling with the FillMaskPipeline\n","from transformers import pipeline\n","\n","fill_mask = pipeline(\n"," \"fill-mask\",\n"," model=\"./KantaiBERT\",\n"," tokenizer=\"./KantaiBERT\"\n",")"],"execution_count":null,"outputs":[{"output_type":"stream","text":["Some weights of RobertaModel were not initialized from the model checkpoint at ./KantaiBERT and are newly initialized: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']\n","You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n"],"name":"stderr"}]},{"cell_type":"code","metadata":{"colab":{"background_save":true},"id":"UIvgZ3S6AO0z","outputId":"19c8ae6c-b5b3-4be9-c84e-772fabc5a5c9"},"source":["fill_mask(\"Human thinking involves <mask>.\")"],"execution_count":null,"outputs":[{"output_type":"execute_result","data":{"text/plain":["[{'score': 
0.010303723625838757,\n"," 'sequence': 'Human thinking involves reason.',\n"," 'token': 394,\n"," 'token_str': 'Ġreason'},\n"," {'score': 0.010289391502737999,\n"," 'sequence': 'Human thinking involves priori.',\n"," 'token': 578,\n"," 'token_str': 'Ġpriori'},\n"," {'score': 0.009549057111144066,\n"," 'sequence': 'Human thinking involves conceptions.',\n"," 'token': 610,\n"," 'token_str': 'Ġconceptions'},\n"," {'score': 0.008349979296326637,\n"," 'sequence': 'Human thinking involves experience.',\n"," 'token': 535,\n"," 'token_str': 'Ġexperience'},\n"," {'score': 0.00743826711550355,\n"," 'sequence': 'Human thinking involves will.',\n"," 'token': 487,\n"," 'token_str': 'Ġwill'}]"]},"metadata":{"tags":[]},"execution_count":0}]}]} ================================================ FILE: Chapter03/kant.txt ================================================ [File too large to display: 10.7 MB] ================================================ FILE: Chapter04/Transformer_tasks.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Transformer tasks.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "widgets": { "application/vnd.jupyter.widget-state+json": { "3f41ce53fe774a86aeab5cedb2217a5b": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_1a64601ca3c04b5e98e1d19375a47751", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_edef5417bf564ca58e080362d7ff66a7", "IPY_MODEL_83bc31d46193435cb8d2ad65d99a457b" ] } }, "1a64601ca3c04b5e98e1d19375a47751": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", 
"state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "edef5417bf564ca58e080362d7ff66a7": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_e78397c4b7bd471191db36e12639e024", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 442, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 442, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_42ae51d8832e45be807d436f41f8ea51" } }, "83bc31d46193435cb8d2ad65d99a457b": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_84785747983f453cae73f9596a7ec6f0", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", 
"_model_module_version": "1.5.0", "value": " 442/442 [00:02<00:00, 183B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_91a64b6bf1bf4c4aa24ad0c4d08dd3df" } }, "e78397c4b7bd471191db36e12639e024": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "42ae51d8832e45be807d436f41f8ea51": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "84785747983f453cae73f9596a7ec6f0": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", 
"description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "91a64b6bf1bf4c4aa24ad0c4d08dd3df": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "acdd285b20eb4823a8dfffe6ecd76201": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_30c1e11f7e9c4ada902dc6edabf234f8", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_5783ab24f4aa44ecbd2c437f01ec8bd5", "IPY_MODEL_35e5cd80564a43749c73a0458cc0d6da" ] } }, "30c1e11f7e9c4ada902dc6edabf234f8": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { 
"_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "5783ab24f4aa44ecbd2c437f01ec8bd5": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_236898968f1e46d2bee145d6d369d0f4", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 231508, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 231508, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_c23ef21b59b141afa84133ee50e5a329" } }, "35e5cd80564a43749c73a0458cc0d6da": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_1852f772c435440e897bbbdf5913598e", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", 
"_model_module_version": "1.5.0", "value": " 232k/232k [00:00<00:00, 298kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_a481c009790841bf912e0413788f2776" } }, "236898968f1e46d2bee145d6d369d0f4": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "c23ef21b59b141afa84133ee50e5a329": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "1852f772c435440e897bbbdf5913598e": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", 
"description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "a481c009790841bf912e0413788f2776": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "a608132fb0c247928252b7b3011fcf7d": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_9c0ead55753243999715167582feb852", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_da90cce7734d450e8e39ebf5e659658f", "IPY_MODEL_f2b8cbe27e4c4a168a9f5c8771e6f54c" ] } }, "9c0ead55753243999715167582feb852": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { 
"_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "da90cce7734d450e8e39ebf5e659658f": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_3a2e5325f1e04541baf033054d514e2a", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 629, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 629, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_2266c15cbb48451bbbd5655d6435b62b" } }, "f2b8cbe27e4c4a168a9f5c8771e6f54c": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_9fbbb8870a95419b90cfbaa8c7db4ae1", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", 
"_model_module_version": "1.5.0", "value": " 629/629 [00:01<00:00, 376B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_f473dd5bf92f4a5eaec7a709d37a1601" } }, "3a2e5325f1e04541baf033054d514e2a": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "2266c15cbb48451bbbd5655d6435b62b": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "9fbbb8870a95419b90cfbaa8c7db4ae1": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", 
"description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "f473dd5bf92f4a5eaec7a709d37a1601": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "f6abfb99d9c24695ab8a5db242947f54": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_7110b475ad774c75a7855636d4212f30", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_802be925656042b19f8c5ded138045bb", "IPY_MODEL_59f4bcea6eb54e269d687cf9618376ea" ] } }, "7110b475ad774c75a7855636d4212f30": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { 
"_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "802be925656042b19f8c5ded138045bb": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_4919fefe558047d6b7f248898ac62f6f", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 230, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 230, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_eee41447bbd7413f826225573a3836f0" } }, "59f4bcea6eb54e269d687cf9618376ea": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_28e9463b30a14ee59a7b65fa99f029e5", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", 
"_model_module_version": "1.5.0", "value": " 230/230 [00:00<00:00, 1.90kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_5786117a2bbb44d1aa70e3ef08872ad5" } }, "4919fefe558047d6b7f248898ac62f6f": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "eee41447bbd7413f826225573a3836f0": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "28e9463b30a14ee59a7b65fa99f029e5": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", 
"description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "5786117a2bbb44d1aa70e3ef08872ad5": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "e9d5f842308740368a11ed1b46aca768": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_d8d5e37ace9b42c5b8fbe0e4763db2a5", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_cec94035159243f9ab03a5034ed26d66", "IPY_MODEL_651c6adc8a064096bf306e5ebc1275c7" ] } }, "d8d5e37ace9b42c5b8fbe0e4763db2a5": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { 
"_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "cec94035159243f9ab03a5034ed26d66": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_54a8b16ce66040c2b297f1b662b350c1", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 267844284, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 267844284, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_f1c4230f55e148338f80ffff65afd1cb" } }, "651c6adc8a064096bf306e5ebc1275c7": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_c2d1d30ef23346f9971c11cff4824012", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", 
"_model_module_version": "1.5.0", "value": " 268M/268M [00:11<00:00, 22.4MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_f62dfd28a2ef4429a8809a7b83d3cfdc" } }, "54a8b16ce66040c2b297f1b662b350c1": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "f1c4230f55e148338f80ffff65afd1cb": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "c2d1d30ef23346f9971c11cff4824012": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": 
"DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "f62dfd28a2ef4429a8809a7b83d3cfdc": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "7ce0e4d211f34e298db9bde71aafd31d": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_ca96ab0cd02644d2897f14ef256f9ab9", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_c8d3c1a200884dfe8cc74efc73643d66", "IPY_MODEL_7a9c6953595d4ab39267c4dfadbf72b4" ] } }, "ca96ab0cd02644d2897f14ef256f9ab9": { "model_module": "@jupyter-widgets/base", "model_name": 
"LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "c8d3c1a200884dfe8cc74efc73643d66": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_c21b433bb763464b99dbc52cd180ae85", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 433, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 433, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_71fda054324a40bc9d852cbb94ae3240" } }, "7a9c6953595d4ab39267c4dfadbf72b4": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_e8ae1c2f79564550beb3df70d9e08295", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": 
"@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 433/433 [00:02<00:00, 195B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_757eaa4714064d93b99918fb9ea3cd43" } }, "c21b433bb763464b99dbc52cd180ae85": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "71fda054324a40bc9d852cbb94ae3240": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "e8ae1c2f79564550beb3df70d9e08295": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", 
"_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "757eaa4714064d93b99918fb9ea3cd43": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "812815f9249b4f6cb138aed2e6a03fd4": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_513fd9d4b17d47e385c7ec7399d7a355", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_c0e34e4be46b4ea395b978fd7108f420", "IPY_MODEL_a4c8291f9c0a44d28e4c89b2a6092373" ] } }, "513fd9d4b17d47e385c7ec7399d7a355": { "model_module": "@jupyter-widgets/base", 
"model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "c0e34e4be46b4ea395b978fd7108f420": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_04ff1050a97d46fcba82145640ff6b78", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 213450, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 213450, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_9bad4a56293449f68e79f5b5dd0d41c1" } }, "a4c8291f9c0a44d28e4c89b2a6092373": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_f0f68e55618a44fd9491524d7e0d8dd5", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", 
"_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 213k/213k [00:00<00:00, 368kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_fed685ae5a7645c28bdb58e3f9703384" } }, "04ff1050a97d46fcba82145640ff6b78": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "9bad4a56293449f68e79f5b5dd0d41c1": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "f0f68e55618a44fd9491524d7e0d8dd5": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": 
"StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "fed685ae5a7645c28bdb58e3f9703384": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "b88724d6b16e472f8ede902cac4ae6f2": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_163770117e5a4d0d95926e3a5d0fbf82", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_7f2b4c0c78994c83a064056dc8e79bb3", "IPY_MODEL_3d95ba7c826c4c3a8265755fd5738434" ] } }, "163770117e5a4d0d95926e3a5d0fbf82": { "model_module": 
"@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "7f2b4c0c78994c83a064056dc8e79bb3": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_1d5c9a930b5a4558ab0647c90d78f085", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 433518744, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 433518744, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_61f23475c541487899f4e559125e7b46" } }, "3d95ba7c826c4c3a8265755fd5738434": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_40212ae99d6e40e6a75836d1e6874dc3", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", 
"placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 434M/434M [00:16<00:00, 27.0MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_890f3d9f1fa5441f9f6c0e8fb8a89c8f" } }, "1d5c9a930b5a4558ab0647c90d78f085": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "61f23475c541487899f4e559125e7b46": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "40212ae99d6e40e6a75836d1e6874dc3": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { 
"_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "890f3d9f1fa5441f9f6c0e8fb8a89c8f": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d6106a736cf046599bc3836b40ad804f": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_76f684a27781484f9cd5ef43df693943", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_d8d9185e24604408a59bb75404fd7daa", "IPY_MODEL_6c0bd986d2664ca896244e3858448962" ] } }, "76f684a27781484f9cd5ef43df693943": { "model_module": 
"@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d8d9185e24604408a59bb75404fd7daa": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_a76a6ab098f1470cafcba41f27c75e74", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 625, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 625, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_1df0601050254a59a1954c4c5d1d2a76" } }, "6c0bd986d2664ca896244e3858448962": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_ac8d253331bb458c8d9f764303bc9f0b", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", 
"placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 625/625 [00:01<00:00, 394B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_93bad43a579342e79fdafdf673a3f8a2" } }, "a76a6ab098f1470cafcba41f27c75e74": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "1df0601050254a59a1954c4c5d1d2a76": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "ac8d253331bb458c8d9f764303bc9f0b": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { 
"_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "93bad43a579342e79fdafdf673a3f8a2": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "cf32487a5c3d4a898ca91e270c3f266b": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_8502995caabc474c91bca08b97cfaa58", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_98b911d9620c40a79c7fee410461d039", "IPY_MODEL_8fb914c5733747e3bae86fcb13073767" ] } }, "8502995caabc474c91bca08b97cfaa58": { "model_module": 
"@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "98b911d9620c40a79c7fee410461d039": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_21acd0fcd9214093aa0e93845052ef7e", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 213450, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 213450, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_aa67bf7e7cf940848b4061bee967052c" } }, "8fb914c5733747e3bae86fcb13073767": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_b1d11f8f842540a682717d350d254155", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", 
"placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 213k/213k [00:00<00:00, 278kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_8418d2e012a14387b231fe12ce0b9a1a" } }, "21acd0fcd9214093aa0e93845052ef7e": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "aa67bf7e7cf940848b4061bee967052c": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "b1d11f8f842540a682717d350d254155": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { 
"_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "8418d2e012a14387b231fe12ce0b9a1a": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "50993327f1d04882a66df50dd30cff3e": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_61c52c61d9c740efab94997706257ba4", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_e23a4ab1a468460a9e4b0f34ae67ef76", "IPY_MODEL_3f1629314ca6407b832db9349a508461" ] } }, "61c52c61d9c740efab94997706257ba4": { "model_module": 
"@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "e23a4ab1a468460a9e4b0f34ae67ef76": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_d577e0e2741d43b8a73489c1a0df2406", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 998, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 998, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_76ccbe1be44a4e019e6a50d32c9abf98" } }, "3f1629314ca6407b832db9349a508461": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_c6eb63c7e9e34d6d989427b9dbe9457c", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", 
"placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 998/998 [00:01<00:00, 621B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_3d9b5bff09414a5dadc8f6f3ea279227" } }, "d577e0e2741d43b8a73489c1a0df2406": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "76ccbe1be44a4e019e6a50d32c9abf98": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "c6eb63c7e9e34d6d989427b9dbe9457c": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { 
"_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "3d9b5bff09414a5dadc8f6f3ea279227": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "ffd924b0cc9d492d888e5da831481033": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_555dfb4c02df4930ae64f4f56ed158b7", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_dc47306a92f648d4a690da79d39ac4cc", "IPY_MODEL_c4dfc9d018534634a056aa3da58fcfef" ] } }, "555dfb4c02df4930ae64f4f56ed158b7": { "model_module": 
"@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "dc47306a92f648d4a690da79d39ac4cc": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_0a03eaaa65144c829bf93cacc2f66e69", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 230, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 230, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_116c0c19179c4c5d847257a531f9269b" } }, "c4dfc9d018534634a056aa3da58fcfef": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_aa9be08c10e44dfc8732f3419b5cc967", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", 
"placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 230/230 [01:38<00:00, 2.33B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_86cbe5290e654770888f9cfba100ae4e" } }, "0a03eaaa65144c829bf93cacc2f66e69": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "116c0c19179c4c5d847257a531f9269b": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "aa9be08c10e44dfc8732f3419b5cc967": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { 
"_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "86cbe5290e654770888f9cfba100ae4e": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "5b66512a4f6944b2ab0a78631d502da3": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_f861ad27060640a49d34dbc6384a236e", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_63c8f259ede54eab8b7eacc2cd393191", "IPY_MODEL_611edbb55ee74d808993a39de5044275" ] } }, "f861ad27060640a49d34dbc6384a236e": { "model_module": 
"@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "63c8f259ede54eab8b7eacc2cd393191": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_336b49d4e3d741aea03fecc236e6333a", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 1334448817, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 1334448817, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_b8c1e428241d4e67abb5df0be3eca758" } }, "611edbb55ee74d808993a39de5044275": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_37c31cab7d4745d9a7944e4bdda8b970", "_dom_classes": [], "description": "", "_model_name": 
"HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 1.33G/1.33G [01:35<00:00, 13.9MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_8fe12fc6327d4027ab66cb5815760e75" } }, "336b49d4e3d741aea03fecc236e6333a": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "b8c1e428241d4e67abb5df0be3eca758": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "37c31cab7d4745d9a7944e4bdda8b970": { "model_module": "@jupyter-widgets/controls", "model_name": 
"DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "8fe12fc6327d4027ab66cb5815760e75": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "10323e3b6d3f43b6b0d0e23de45b5729": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_8bc14cbd5a01449eb01f2e01e9db2fa2", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_43faf48017b84e8b9b0c5bf29337ed30", "IPY_MODEL_7eccb6f09783498ab44020360c1e0062" ] } }, 
"8bc14cbd5a01449eb01f2e01e9db2fa2": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "43faf48017b84e8b9b0c5bf29337ed30": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_6738833ebeb94b60912151d886ca083e", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 1199, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 1199, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_6b4c9bb4710541b29c2894739c24472c" } }, "7eccb6f09783498ab44020360c1e0062": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_f2febefe55c64bea8d5fa39b8c94ba01", "_dom_classes": [], 
"description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 1.20k/1.20k [00:04<00:00, 292B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_1a8391734975484ca3feaf86d6da1161" } }, "6738833ebeb94b60912151d886ca083e": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "6b4c9bb4710541b29c2894739c24472c": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "f2febefe55c64bea8d5fa39b8c94ba01": { "model_module": "@jupyter-widgets/controls", 
"model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "1a8391734975484ca3feaf86d6da1161": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "1314be84dd424e94b13ee568840c7ea2": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_63392f66c53d479199e24d8d8a45823a", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_0cdc54e5dc434c73be4ef332025d0d97", "IPY_MODEL_bd0ab5247d6349819f4ba64533e5cc80" ] } }, 
"63392f66c53d479199e24d8d8a45823a": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "0cdc54e5dc434c73be4ef332025d0d97": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_e31316c758bb4236905d4291302a9e15", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 791656, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 791656, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_20af50bd6ae74e31b3c0b7f2b6e054d5" } }, "bd0ab5247d6349819f4ba64533e5cc80": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_8b035f4d02f2447a961ce83766868e09", "_dom_classes": [], 
"description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 792k/792k [00:02<00:00, 316kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_28c23c928d8c440783419e71bfb917d7" } }, "e31316c758bb4236905d4291302a9e15": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "20af50bd6ae74e31b3c0b7f2b6e054d5": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "8b035f4d02f2447a961ce83766868e09": { "model_module": "@jupyter-widgets/controls", 
"model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "28c23c928d8c440783419e71bfb917d7": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "f5c80d9b2f804af3a4bf07f97ba06bf1": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_bcfe5e255d8a422baf1175c3b23e52e5", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_b50f807793aa404f9ce54c55403f8dd3", "IPY_MODEL_6edc69c824bc4f63914e5850e589448a" ] } }, 
"bcfe5e255d8a422baf1175c3b23e52e5": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "b50f807793aa404f9ce54c55403f8dd3": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_224b69b50c9846a1b244134d23635efd", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 230, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 230, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_e26e0596f8534c79b7595361eda687ce" } }, "6edc69c824bc4f63914e5850e589448a": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_d04989abce5d481dbee28ac450a32d2d", "_dom_classes": [], 
"description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 230/230 [00:21<00:00, 10.7B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_38a758bfadeb443686bc615481cd9da9" } }, "224b69b50c9846a1b244134d23635efd": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "e26e0596f8534c79b7595361eda687ce": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d04989abce5d481dbee28ac450a32d2d": { "model_module": "@jupyter-widgets/controls", 
"model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "38a758bfadeb443686bc615481cd9da9": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "0d335835f44548efadcf0ecd8a49e391": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_b08f64e0d5994eb9adffdfc1d48e9088", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_45ea2974a40c4786a43ee07cdbbcd693", "IPY_MODEL_35dd5a9f5c0843838571880dc058661a" ] } }, 
"b08f64e0d5994eb9adffdfc1d48e9088": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "45ea2974a40c4786a43ee07cdbbcd693": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_c1e3435e66d64c498e2334da0c521b60", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 891691430, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 891691430, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_39500a63ce0c401a8f52d3f1610e14a3" } }, "35dd5a9f5c0843838571880dc058661a": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_e1be44708f6d4fc0a64a36953787eef0", 
"_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 892M/892M [00:18<00:00, 47.7MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_13edcd9f5be5479d894bbe150c64e3d0" } }, "c1e3435e66d64c498e2334da0c521b60": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "39500a63ce0c401a8f52d3f1610e14a3": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "e1be44708f6d4fc0a64a36953787eef0": { "model_module": 
"@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "13edcd9f5be5479d894bbe150c64e3d0": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } } } }, "accelerator": "GPU" }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "aZSBwt0M5Hmf", "colab_type": "text" }, "source": [ "#Usage of Transformers\n", "\n", "[Original Hugging Face Code Reference](https://huggingface.co/transformers/usage.html)\n", "\n", "[Paper citation: HuggingFace's Transformers: State-of-the-art Natural Language Processing](https://arxiv.org/abs/1910.03771)\n", "\n", "Copyright Denis Rothman 2020, MIT License. The original usage examples have been changed for educational purposes." 
] }, { "cell_type": "code", "metadata": { "id": "nQ0myH1cLaQ7", "colab_type": "code", "outputId": "6532ebf7-6c77-4b6b-c81c-c46b0defb059", "colab": { "base_uri": "https://localhost:8080/", "height": 119 } }, "source": [ "#@title Transformer and Torch Installation\n", "try:\n", " import transformers\n", "except:\n", " print(\"Installing transformers\")\n", " !pip -qq install transformers\n", "\n", "try:\n", " import torch\n", "except:\n", " print(\"Installing Torch\")\n", " !pip -qq install torch" ], "execution_count": 1, "outputs": [ { "output_type": "stream", "text": [ "Installing transformers\n", "\u001b[K |████████████████████████████████| 675kB 2.8MB/s \n", "\u001b[K |████████████████████████████████| 3.8MB 12.9MB/s \n", "\u001b[K |████████████████████████████████| 1.1MB 32.9MB/s \n", "\u001b[K |████████████████████████████████| 890kB 42.5MB/s \n", "\u001b[?25h Building wheel for sacremoses (setup.py) ... \u001b[?25l\u001b[?25hdone\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "foamjwawe2OX", "colab_type": "code", "outputId": "a3944c8e-f677-4ea6-b1e8-0894ce5d1bd4", "colab": { "base_uri": "https://localhost:8080/", "height": 296, "referenced_widgets": [ "3f41ce53fe774a86aeab5cedb2217a5b", "1a64601ca3c04b5e98e1d19375a47751", "edef5417bf564ca58e080362d7ff66a7", "83bc31d46193435cb8d2ad65d99a457b", "e78397c4b7bd471191db36e12639e024", "42ae51d8832e45be807d436f41f8ea51", "84785747983f453cae73f9596a7ec6f0", "91a64b6bf1bf4c4aa24ad0c4d08dd3df", "acdd285b20eb4823a8dfffe6ecd76201", "30c1e11f7e9c4ada902dc6edabf234f8", "5783ab24f4aa44ecbd2c437f01ec8bd5", "35e5cd80564a43749c73a0458cc0d6da", "236898968f1e46d2bee145d6d369d0f4", "c23ef21b59b141afa84133ee50e5a329", "1852f772c435440e897bbbdf5913598e", "a481c009790841bf912e0413788f2776", "a608132fb0c247928252b7b3011fcf7d", "9c0ead55753243999715167582feb852", "da90cce7734d450e8e39ebf5e659658f", "f2b8cbe27e4c4a168a9f5c8771e6f54c", "3a2e5325f1e04541baf033054d514e2a", "2266c15cbb48451bbbd5655d6435b62b", 
"9fbbb8870a95419b90cfbaa8c7db4ae1", "f473dd5bf92f4a5eaec7a709d37a1601", "f6abfb99d9c24695ab8a5db242947f54", "7110b475ad774c75a7855636d4212f30", "802be925656042b19f8c5ded138045bb", "59f4bcea6eb54e269d687cf9618376ea", "4919fefe558047d6b7f248898ac62f6f", "eee41447bbd7413f826225573a3836f0", "28e9463b30a14ee59a7b65fa99f029e5", "5786117a2bbb44d1aa70e3ef08872ad5", "e9d5f842308740368a11ed1b46aca768", "d8d5e37ace9b42c5b8fbe0e4763db2a5", "cec94035159243f9ab03a5034ed26d66", "651c6adc8a064096bf306e5ebc1275c7", "54a8b16ce66040c2b297f1b662b350c1", "f1c4230f55e148338f80ffff65afd1cb", "c2d1d30ef23346f9971c11cff4824012", "f62dfd28a2ef4429a8809a7b83d3cfdc" ] } }, "source": [ "#@title SST-2 Binary Classification\n", "from transformers import pipeline\n", "\n", "nlp = pipeline(\"sentiment-analysis\")\n", "\n", "print(nlp(\"If you sometimes like to go to the movies to have fun , Wasabi is a good place to start .\"),\"If you sometimes like to go to the movies to have fun , Wasabi is a good place to start .\")\n", "print(nlp(\"Effective but too-tepid biopic.\"),\"Effective but too-tepid biopic.\")" ], "execution_count": 2, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "3f41ce53fe774a86aeab5cedb2217a5b", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=442.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "acdd285b20eb4823a8dfffe6ecd76201", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=231508.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { 
"application/vnd.jupyter.widget-view+json": { "model_id": "a608132fb0c247928252b7b3011fcf7d", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=629.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "f6abfb99d9c24695ab8a5db242947f54", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "e9d5f842308740368a11ed1b46aca768", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=267844284.0, style=ProgressStyle(descri…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n", "[{'label': 'POSITIVE', 'score': 0.999825656414032}] If you sometimes like to go to the movies to have fun , Wasabi is a good place to start .\n", "[{'label': 'NEGATIVE', 'score': 0.9974064230918884}] Effective but too-tepid biopic.\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "outputId": "ac628ecd-9aac-46ba-bd8c-f52a80122508", "id": "iILfeaHLlivA", "colab": { "base_uri": "https://localhost:8080/", "height": 215, "referenced_widgets": [ "7ce0e4d211f34e298db9bde71aafd31d", "ca96ab0cd02644d2897f14ef256f9ab9", "c8d3c1a200884dfe8cc74efc73643d66", "7a9c6953595d4ab39267c4dfadbf72b4", "c21b433bb763464b99dbc52cd180ae85", "71fda054324a40bc9d852cbb94ae3240", "e8ae1c2f79564550beb3df70d9e08295", "757eaa4714064d93b99918fb9ea3cd43", "812815f9249b4f6cb138aed2e6a03fd4", 
"513fd9d4b17d47e385c7ec7399d7a355", "c0e34e4be46b4ea395b978fd7108f420", "a4c8291f9c0a44d28e4c89b2a6092373", "04ff1050a97d46fcba82145640ff6b78", "9bad4a56293449f68e79f5b5dd0d41c1", "f0f68e55618a44fd9491524d7e0d8dd5", "fed685ae5a7645c28bdb58e3f9703384", "b88724d6b16e472f8ede902cac4ae6f2", "163770117e5a4d0d95926e3a5d0fbf82", "7f2b4c0c78994c83a064056dc8e79bb3", "3d95ba7c826c4c3a8265755fd5738434", "1d5c9a930b5a4558ab0647c90d78f085", "61f23475c541487899f4e559125e7b46", "40212ae99d6e40e6a75836d1e6874dc3", "890f3d9f1fa5441f9f6c0e8fb8a89c8f" ] } }, "source": [ "#@title Sequence Classification : paraphrase classification\n", "from transformers import AutoTokenizer, TFAutoModelForSequenceClassification\n", "import tensorflow as tf\n", "\n", "tokenizer = AutoTokenizer.from_pretrained(\"bert-base-cased-finetuned-mrpc\")\n", "model = TFAutoModelForSequenceClassification.from_pretrained(\"bert-base-cased-finetuned-mrpc\")\n", "\n", "classes = [\"not paraphrase\", \"is paraphrase\"]\n", "\n", "sequence_A = \"The DVD-CCA then appealed to the state Supreme Court.\"\n", "sequence_B = \"The DVD CCA appealed that decision to the U.S. 
Supreme Court.\"\n", "\n", "paraphrase = tokenizer.encode_plus(sequence_A, sequence_B, return_tensors=\"tf\")\n", "\n", "paraphrase_classification_logits = model(paraphrase)[0]\n", "\n", "paraphrase_results = tf.nn.softmax(paraphrase_classification_logits, axis=1).numpy()[0]\n", "\n", "print(sequence_B, \"should be a paraphrase\")\n", "for i in range(len(classes)):\n", " print(f\"{classes[i]}: {round(paraphrase_results[i] * 100)}%\")" ], "execution_count": 3, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "7ce0e4d211f34e298db9bde71aafd31d", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=433.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "812815f9249b4f6cb138aed2e6a03fd4", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=213450.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "b88724d6b16e472f8ede902cac4ae6f2", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=433518744.0, style=ProgressStyle(descri…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n", "The DVD CCA appealed that decision to the U.S. Supreme Court. 
should be a paraphrase\n", "not paraphrase: 8.0%\n", "is paraphrase: 92.0%\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "PyQKscwtYgCW", "colab_type": "code", "outputId": "8ea9d4b7-0c4c-4bd9-d9c2-b497db7253f3", "colab": { "base_uri": "https://localhost:8080/", "height": 299, "referenced_widgets": [ "d6106a736cf046599bc3836b40ad804f", "76f684a27781484f9cd5ef43df693943", "d8d9185e24604408a59bb75404fd7daa", "6c0bd986d2664ca896244e3858448962", "a76a6ab098f1470cafcba41f27c75e74", "1df0601050254a59a1954c4c5d1d2a76", "ac8d253331bb458c8d9f764303bc9f0b", "93bad43a579342e79fdafdf673a3f8a2", "cf32487a5c3d4a898ca91e270c3f266b", "8502995caabc474c91bca08b97cfaa58", "98b911d9620c40a79c7fee410461d039", "8fb914c5733747e3bae86fcb13073767", "21acd0fcd9214093aa0e93845052ef7e", "aa67bf7e7cf940848b4061bee967052c", "b1d11f8f842540a682717d350d254155", "8418d2e012a14387b231fe12ce0b9a1a", "50993327f1d04882a66df50dd30cff3e", "61c52c61d9c740efab94997706257ba4", "e23a4ab1a468460a9e4b0f34ae67ef76", "3f1629314ca6407b832db9349a508461", "d577e0e2741d43b8a73489c1a0df2406", "76ccbe1be44a4e019e6a50d32c9abf98", "c6eb63c7e9e34d6d989427b9dbe9457c", "3d9b5bff09414a5dadc8f6f3ea279227", "ffd924b0cc9d492d888e5da831481033", "555dfb4c02df4930ae64f4f56ed158b7", "dc47306a92f648d4a690da79d39ac4cc", "c4dfc9d018534634a056aa3da58fcfef", "0a03eaaa65144c829bf93cacc2f66e69", "116c0c19179c4c5d847257a531f9269b", "aa9be08c10e44dfc8732f3419b5cc967", "86cbe5290e654770888f9cfba100ae4e", "5b66512a4f6944b2ab0a78631d502da3", "f861ad27060640a49d34dbc6384a236e", "63c8f259ede54eab8b7eacc2cd393191", "611edbb55ee74d808993a39de5044275", "336b49d4e3d741aea03fecc236e6333a", "b8c1e428241d4e67abb5df0be3eca758", "37c31cab7d4745d9a7944e4bdda8b970", "8fe12fc6327d4027ab66cb5815760e75" ] } }, "source": [ "#@title Named Entity Recognition(NER)\n", "from transformers import pipeline\n", "nlp = pipeline(\"ner\")\n", "sequence = \"Hugging Face Inc. is a company based in New York City. 
Its headquarters are in DUMBO, therefore very\" \\\n", " \"close to the Manhattan Bridge which is visible from the window.\"\n", "print(nlp(sequence))" ], "execution_count": 4, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "d6106a736cf046599bc3836b40ad804f", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=625.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "cf32487a5c3d4a898ca91e270c3f266b", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=213450.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "50993327f1d04882a66df50dd30cff3e", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=998.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ffd924b0cc9d492d888e5da831481033", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "5b66512a4f6944b2ab0a78631d502da3", "version_minor": 0, "version_major": 
2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=1334448817.0, style=ProgressStyle(descr…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n", "[{'word': 'Hu', 'score': 0.9995632767677307, 'entity': 'I-ORG', 'index': 1}, {'word': '##gging', 'score': 0.9915938377380371, 'entity': 'I-ORG', 'index': 2}, {'word': 'Face', 'score': 0.9982671737670898, 'entity': 'I-ORG', 'index': 3}, {'word': 'Inc', 'score': 0.9994403719902039, 'entity': 'I-ORG', 'index': 4}, {'word': 'New', 'score': 0.9994346499443054, 'entity': 'I-LOC', 'index': 11}, {'word': 'York', 'score': 0.9993270635604858, 'entity': 'I-LOC', 'index': 12}, {'word': 'City', 'score': 0.9993864893913269, 'entity': 'I-LOC', 'index': 13}, {'word': 'D', 'score': 0.9825621843338013, 'entity': 'I-LOC', 'index': 19}, {'word': '##UM', 'score': 0.936983048915863, 'entity': 'I-LOC', 'index': 20}, {'word': '##BO', 'score': 0.8987101316452026, 'entity': 'I-LOC', 'index': 21}, {'word': 'Manhattan', 'score': 0.9758241176605225, 'entity': 'I-LOC', 'index': 29}, {'word': 'Bridge', 'score': 0.9902493953704834, 'entity': 'I-LOC', 'index': 30}]\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "lbwBChUX7grO", "colab_type": "text" }, "source": [ "Prosody represented by \"ha,ha\" combined with \"sure\" could be positive or negative; context is required. Words such as \"not\", \"else\", and \"however\" are weighted too strongly in this model."
] }, { "cell_type": "code", "metadata": { "id": "l871bLNcNWiA", "colab_type": "code", "outputId": "37e7bce6-1c08-4872-e78a-1f9ebc9f32dd", "colab": { "base_uri": "https://localhost:8080/", "height": 213, "referenced_widgets": [ "10323e3b6d3f43b6b0d0e23de45b5729", "8bc14cbd5a01449eb01f2e01e9db2fa2", "43faf48017b84e8b9b0c5bf29337ed30", "7eccb6f09783498ab44020360c1e0062", "6738833ebeb94b60912151d886ca083e", "6b4c9bb4710541b29c2894739c24472c", "f2febefe55c64bea8d5fa39b8c94ba01", "1a8391734975484ca3feaf86d6da1161", "1314be84dd424e94b13ee568840c7ea2", "63392f66c53d479199e24d8d8a45823a", "0cdc54e5dc434c73be4ef332025d0d97", "bd0ab5247d6349819f4ba64533e5cc80", "e31316c758bb4236905d4291302a9e15", "20af50bd6ae74e31b3c0b7f2b6e054d5", "8b035f4d02f2447a961ce83766868e09", "28c23c928d8c440783419e71bfb917d7", "f5c80d9b2f804af3a4bf07f97ba06bf1", "bcfe5e255d8a422baf1175c3b23e52e5", "b50f807793aa404f9ce54c55403f8dd3", "6edc69c824bc4f63914e5850e589448a", "224b69b50c9846a1b244134d23635efd", "e26e0596f8534c79b7595361eda687ce", "d04989abce5d481dbee28ac450a32d2d", "38a758bfadeb443686bc615481cd9da9", "0d335835f44548efadcf0ecd8a49e391", "b08f64e0d5994eb9adffdfc1d48e9088", "45ea2974a40c4786a43ee07cdbbcd693", "35dd5a9f5c0843838571880dc058661a", "c1e3435e66d64c498e2334da0c521b60", "39500a63ce0c401a8f52d3f1610e14a3", "e1be44708f6d4fc0a64a36953787eef0", "13edcd9f5be5479d894bbe150c64e3d0" ] } }, "source": [ "#@title Winograd\n", "from transformers import pipeline\n", "translator = pipeline(\"translation_en_to_fr\")" ], "execution_count": 5, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "10323e3b6d3f43b6b0d0e23de45b5729", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=1199.0, style=ProgressStyle(description…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { 
"application/vnd.jupyter.widget-view+json": { "model_id": "1314be84dd424e94b13ee568840c7ea2", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=791656.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "f5c80d9b2f804af3a4bf07f97ba06bf1", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "0d335835f44548efadcf0ecd8a49e391", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=891691430.0, style=ProgressStyle(descri…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "Jslzg16dTa0K", "colab_type": "code", "outputId": "083a682a-2b86-47c7-c6fd-ffc83f5cd829", "colab": { "base_uri": "https://localhost:8080/", "height": 34 } }, "source": [ "print(translator(\"The car could not go in the garage because it was too big.\", max_length=40))" ], "execution_count": 6, "outputs": [ { "output_type": "stream", "text": [ "[{'translation_text': \"La voiture ne pouvait pas aller dans le garage parce qu'elle était trop grosse.\"}]\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter05/BLEU.py ================================================ #BLEU : Bilingual Evaluation Understudy Score #Copyright 2020, MIT License BLEU Examples #REF PAPER: Kishore Papineni, et al.,2002,“BLEU: a Method for Automatic 
Evaluation of Machine Translation".
# https://www.aclweb.org/anthology/P02-1040.pdf
#NLTK : Natural Language Toolkit
#NLTK sentence_bleu doc: http://www.nltk.org/api/nltk.translate.html#nltk.translate.bleu_score.sentence_bleu
#NLTK smoothing doc: https://www.nltk.org/api/nltk.translate.html
#NLTK REF PAPER for smoothing(): Chen et al., http://acl2014.org/acl2014/W14-33/pdf/W14-3346.pdf
#REF DOC : https://machinelearningmastery.com/calculate-bleu-score-for-text-python/
from nltk.translate.bleu_score import sentence_bleu
from nltk.translate.bleu_score import SmoothingFunction

#Example 1
reference = [['the', 'cat', 'likes', 'milk'], ['cat', 'likes', 'milk']]
candidate = ['the', 'cat', 'likes', 'milk']
score = sentence_bleu(reference, candidate)
print('Example 1', score)

#Example 2
reference = [['the', 'cat', 'likes', 'milk']]
candidate = ['the', 'cat', 'likes', 'milk']
score = sentence_bleu(reference, candidate)
print('Example 2', score)

#Example 3
reference = [['the', 'cat', 'likes', 'milk']]
candidate = ['the', 'cat', 'enjoys', 'milk']
score = sentence_bleu(reference, candidate)
print('Example 3', score)

#Example 4
reference = [['je', 'vous', 'invite', 'a', 'vous', 'lever', 'pour', 'cette', 'minute', 'de', 'silence']]
candidate = ['levez', 'vous', 'svp', 'pour', 'cette', 'minute', 'de', 'silence']
score = sentence_bleu(reference, candidate)
print("without smoothing score", score)

chencherry = SmoothingFunction()
r1 = list('je vous invite a vous lever pour cette minute de silence')
candidate = list('levez vous svp pour cette minute de silence')
#sentence_bleu([reference1, reference2, reference3], hypothesis2, smoothing_function=chencherry.method1)
print("with smoothing score", sentence_bleu([r1], candidate, smoothing_function=chencherry.method1))

================================================
FILE: Chapter05/Trax_Translation.ipynb
================================================
{ "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Trax_Translation.ipynb",
"provenance": [], "collapsed_sections": [] }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "accelerator": "GPU" }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "1liQji85FAIp" }, "source": [ "#Machine Translation with Trax\n", "\n", "Note by Denis Rothman: The original notebook was split into cells.\n", "\n", "[Reference Code](https://colab.research.google.com/github/google/trax/blob/master/trax/intro.ipynb)\n" ] }, { "cell_type": "code", "metadata": { "id": "h0pjcihTE9fR" }, "source": [ "#@title Installing Trax\n", "import os\n", "import numpy as np\n", "\n", "!pip install -q -U trax\n", "import trax" ], "execution_count": 7, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "ivTjrL-BMD8i" }, "source": [ "#@title Creating\n", "# Pre-trained model config in gs://trax-ml/models/translation/ende_wmt32k.gin\n", "model = trax.models.Transformer(\n", " input_vocab_size=33300,\n", " d_model=512, d_ff=2048,\n", " n_heads=8, n_encoder_layers=6, n_decoder_layers=6,\n", " max_len=2048, mode='predict')\n" ], "execution_count": 8, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "oJgRqlrmMKbo" }, "source": [ "#@title Initializing the model using pre-trained weights\n", "model.init_from_file('gs://trax-ml/models/translation/ende_wmt32k.pkl.gz',\n", " weights_only=True)" ], "execution_count": 9, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "HvwJ5w-6MQNw" }, "source": [ "#@title Tokenizing a sentence\n", "sentence = 'I am only a machine but I have machine intelligence.'\n", "\n", "tokenized = list(trax.data.tokenize(iter([sentence]), # Operates on streams.\n", " vocab_dir='gs://trax-ml/vocabs/',\n", " vocab_file='ende_32k.subword'))[0]\n" ], "execution_count": 10, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "IVkBQOvmMW9A" }, "source": [ "#@title Decoding from the Transformer\n", "tokenized = tokenized[None, :] # Add batch dimension.\n", "tokenized_translation = 
trax.supervised.decoding.autoregressive_sample(\n", " model, tokenized, temperature=0.0) # Higher temperature: more diverse results.\n" ], "execution_count": 11, "outputs": [] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "QV2xr8_7Mc4B", "outputId": "c78c12ea-84a1-4fd5-fb2e-770fadc19e8b" }, "source": [ "#@title De-tokenizing and Displaying the Translation\n", "tokenized_translation = tokenized_translation[0][:-1] # Remove batch and EOS.\n", "translation = trax.data.detokenize(tokenized_translation,\n", " vocab_dir='gs://trax-ml/vocabs/',\n", " vocab_file='ende_32k.subword')\n", "print(\"The sentence:\",sentence)\n", "print(\"The translation:\",translation)" ], "execution_count": 12, "outputs": [ { "output_type": "stream", "text": [ "The sentence: I am only a machine but I have machine intelligence.\n", "The translation: Ich bin nur eine Maschine, aber ich habe Maschinenübersicht.\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter05/read.py ================================================ #Pre-Processing datasets for Machine Translation #Copyright 2020, Denis Rothman, MIT License #Denis Rothman modified the code for educational purposes. 
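A quick illustration of what the cleaning pipeline in this script does to a single line may help before reading the full code. This is a minimal, self-contained sketch added for illustration only (the helper name `clean_line` is not part of the repository); it mirrors the steps of `clean_lines` below: Unicode NFD normalization with ASCII transliteration, lowercasing, punctuation stripping, non-printable filtering, and removal of non-alphabetic tokens.

```python
# Illustrative sketch only: a single-line version of the clean_lines()
# steps used in this script (the helper name clean_line is hypothetical).
import re
import string
import unicodedata

def clean_line(line):
    re_print = re.compile('[^%s]' % re.escape(string.printable))  # non-printable filter
    table = str.maketrans('', '', string.punctuation)             # punctuation stripper
    # NFD normalization + ASCII encoding drops accents (é -> e) instead of whole words
    line = unicodedata.normalize('NFD', line).encode('ascii', 'ignore').decode('UTF-8')
    tokens = [word.lower().translate(table) for word in line.split()]
    tokens = [re_print.sub('', word) for word in tokens]
    # keep purely alphabetic tokens, dropping numbers and leftover symbols
    return ' '.join(word for word in tokens if word.isalpha())

print(clean_line('Reprise de la session!'))  # -> reprise de la session
```

Keeping only `isalpha()` tokens means numbers and mixed tokens such as `2nd` are dropped entirely, which is acceptable here because the goal is a reduced, clean vocabulary for translation experiments.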
#Reference: #Jason Brownlee PhD, ‘How to Prepare a French-to-English Dataset for Machine Translation’ # https://machinelearningmastery.com/prepare-french-english-dataset-machine-translation/ import pickle from pickle import dump # load doc into memory def load_doc(filename): # open the file as read only file = open(filename, mode='rt', encoding='utf-8') # read all text text = file.read() # close the file file.close() return text # split a loaded document into sentences def to_sentences(doc): return doc.strip().split('\n') # shortest and longest sentence lengths def sentence_lengths(sentences): lengths = [len(s.split()) for s in sentences] return min(lengths), max(lengths) # clean lines import re import string import unicodedata def clean_lines(lines): cleaned = list() # prepare regex for char filtering re_print = re.compile('[^%s]' % re.escape(string.printable)) # prepare translation table for removing punctuation table = str.maketrans('', '', string.punctuation) for line in lines: # normalize unicode characters line = unicodedata.normalize('NFD', line).encode('ascii', 'ignore') line = line.decode('UTF-8') # tokenize on white space line = line.split() # convert to lower case line = [word.lower() for word in line] # remove punctuation from each token line = [word.translate(table) for word in line] # remove non-printable chars from each token line = [re_print.sub('', w) for w in line] # remove tokens with numbers in them line = [word for word in line if word.isalpha()] # store as string cleaned.append(' '.join(line)) return cleaned # load English data filename = 'europarl-v7.fr-en.en' doc = load_doc(filename) sentences = to_sentences(doc) minlen, maxlen = sentence_lengths(sentences) print('English data: sentences=%d, min=%d, max=%d' % (len(sentences), minlen, maxlen)) cleanf=clean_lines(sentences) filename = 'English.pkl' outfile = open(filename,'wb') pickle.dump(cleanf,outfile) outfile.close() print(filename," saved") # load French data filename = 
'europarl-v7.fr-en.fr' doc = load_doc(filename) sentences = to_sentences(doc) minlen, maxlen = sentence_lengths(sentences) print('French data: sentences=%d, min=%d, max=%d' % (len(sentences), minlen, maxlen)) cleanf=clean_lines(sentences) filename = 'French.pkl' outfile = open(filename,'wb') pickle.dump(cleanf,outfile) outfile.close() print(filename," saved") ================================================ FILE: Chapter05/read_clean.py ================================================ #Pre-Processing datasets for Machine Translation #Copyright 2020, Denis Rothman, MIT License #Denis Rothman modified the code for educational purposes. #Reference: #Jason Brownlee PhD, ‘How to Prepare a French-to-English Dataset for Machine Translation’ # https://machinelearningmastery.com/prepare-french-english-dataset-machine-translation/ from pickle import load from pickle import dump from collections import Counter # load a clean dataset def load_clean_sentences(filename): return load(open(filename, 'rb')) # save a list of clean sentences to file def save_clean_sentences(sentences, filename): dump(sentences, open(filename, 'wb')) print('Saved: %s' % filename) # create a frequency table for all words def to_vocab(lines): vocab = Counter() for line in lines: tokens = line.split() vocab.update(tokens) return vocab # remove all words with a frequency below a threshold def trim_vocab(vocab, min_occurrence): tokens = [k for k,c in vocab.items() if c >= min_occurrence] return set(tokens) # mark all OOV with "unk" for all lines def update_dataset(lines, vocab): new_lines = list() for line in lines: new_tokens = list() for token in line.split(): if token in vocab: new_tokens.append(token) else: new_tokens.append('unk') new_line = ' '.join(new_tokens) new_lines.append(new_line) return new_lines # load English dataset filename = 'English.pkl' lines = load_clean_sentences(filename) # calculate vocabulary vocab = to_vocab(lines) print('English Vocabulary: %d' % len(vocab)) # reduce vocabulary 
vocab = trim_vocab(vocab, 5) print('New English Vocabulary: %d' % len(vocab)) # mark out of vocabulary words lines = update_dataset(lines, vocab) # save updated dataset filename = 'english_vocab.pkl' save_clean_sentences(lines, filename) # spot check for i in range(20): print("line",i,":",lines[i]) # load French dataset filename = 'French.pkl' lines = load_clean_sentences(filename) # calculate vocabulary vocab = to_vocab(lines) print('French Vocabulary: %d' % len(vocab)) # reduce vocabulary vocab = trim_vocab(vocab, 5) print('New French Vocabulary: %d' % len(vocab)) # mark out of vocabulary words lines = update_dataset(lines, vocab) # save updated dataset filename = 'french_vocab.pkl' save_clean_sentences(lines, filename) # spot check for i in range(20): print("line",i,":",lines[i]) ================================================ FILE: Chapter06/OpenAI_GPT_2.ipynb ================================================ {"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"name":"OpenAI_GPT_2_KS.ipynb","provenance":[],"collapsed_sections":[],"toc_visible":true},"kernelspec":{"name":"python3","display_name":"Python 3"},"accelerator":"GPU"},"cells":[{"cell_type":"markdown","metadata":{"id":"LH2YgC7LfzJZ"},"source":["#OpenAI GPT-2\n","Copyright 2020, Denis Rothman MIT License. Denis Rothman created the Colab notebook using the OpenAI repository, adding title steps for educational purposes only.\n","\n","It is important to note that we are running a low-level GPT-2 model \n","and not a one-line call to obtain a result. We are also\n","avoiding pre-packaged versions. We are getting our hands dirty to\n","understand the architecture of a GPT-2 from scratch. You might get\n","some deprecation messages. 
However, the effort is worthwhile.\n","\n","***Code Reference***\n","[Reference: OpenAI Repository](https://github.com/openai/gpt-2)\n","\n","***Model Reference***\n","[Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever,2019,'Language Models are Unsupervised Multitask Learners'](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)\n","\n","\n","Step 1: Pre-requisite: activate GPU in the notebook settings runTime menu\n","\n"]},{"cell_type":"code","metadata":{"id":"isqdu1fpfmqM","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1611121642694,"user_tz":-330,"elapsed":2122,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}},"outputId":"0893439c-1785-4977-ac91-9fa8088c3b03"},"source":["#@title Step 2: Cloning the OpenAI GPT-2 Repository \n","!git clone https://github.com/openai/gpt-2.git"],"execution_count":1,"outputs":[{"output_type":"stream","text":["Cloning into 'gpt-2'...\n","remote: Enumerating objects: 233, done.\u001b[K\n","remote: Total 233 (delta 0), reused 0 (delta 0), pack-reused 233\u001b[K\n","Receiving objects: 100% (233/233), 4.38 MiB | 23.47 MiB/s, done.\n","Resolving deltas: 100% (124/124), done.\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"7RHOjN-TjUbj","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1611121666299,"user_tz":-330,"elapsed":14069,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}},"outputId":"3d3312bf-e2c9-489f-9a6f-061ea6a34340"},"source":["#@title Step 3: Installing the requirements\n","import os # when the VM restarts import os necessary\n","os.chdir(\"/content/gpt-2\") \n","!pip3 install -r 
requirements.txt"],"execution_count":2,"outputs":[{"output_type":"stream","text":["Collecting fire>=0.1.3\n","\u001b[?25l Downloading https://files.pythonhosted.org/packages/34/a7/0e22e70778aca01a52b9c899d9c145c6396d7b613719cd63db97ffa13f2f/fire-0.3.1.tar.gz (81kB)\n","\u001b[K |████████████████████████████████| 81kB 7.8MB/s \n","\u001b[?25hCollecting regex==2017.4.5\n","\u001b[?25l Downloading https://files.pythonhosted.org/packages/36/62/c0c0d762ffd4ffaf39f372eb8561b8d491a11ace5a7884610424a8b40f95/regex-2017.04.05.tar.gz (601kB)\n","\u001b[K |████████████████████████████████| 604kB 24.4MB/s \n","\u001b[?25hCollecting requests==2.21.0\n","\u001b[?25l Downloading https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl (57kB)\n","\u001b[K |████████████████████████████████| 61kB 9.6MB/s \n","\u001b[?25hCollecting tqdm==4.31.1\n","\u001b[?25l Downloading https://files.pythonhosted.org/packages/6c/4b/c38b5144cf167c4f52288517436ccafefe9dc01b8d1c190e18a6b154cd4a/tqdm-4.31.1-py2.py3-none-any.whl (48kB)\n","\u001b[K |████████████████████████████████| 51kB 6.1MB/s \n","\u001b[?25hRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.15.0)\n","Requirement already satisfied: termcolor in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.1.0)\n","Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (3.0.4)\n","Collecting idna<2.9,>=2.5\n","\u001b[?25l Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)\n","\u001b[K |████████████████████████████████| 61kB 9.9MB/s \n","\u001b[?25hRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from 
requests==2.21.0->-r requirements.txt (line 3)) (2020.12.5)\n","Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (1.24.3)\n","Building wheels for collected packages: fire, regex\n"," Building wheel for fire (setup.py) ... \u001b[?25l\u001b[?25hdone\n"," Created wheel for fire: filename=fire-0.3.1-py2.py3-none-any.whl size=111006 sha256=84d334e01481079528fbe07f0be1143f7f49c6f454c39837521ed84d822943a1\n"," Stored in directory: /root/.cache/pip/wheels/c1/61/df/768b03527bf006b546dce284eb4249b185669e65afc5fbb2ac\n"," Building wheel for regex (setup.py) ... \u001b[?25l\u001b[?25hdone\n"," Created wheel for regex: filename=regex-2017.4.5-cp36-cp36m-linux_x86_64.whl size=533190 sha256=e6d35cedb29485199a171cead1d9904cf5a633fd8b2860c419d5f1dbdfc8567f\n"," Stored in directory: /root/.cache/pip/wheels/75/07/38/3c16b529d50cb4e0cd3dbc7b75cece8a09c132692c74450b01\n","Successfully built fire regex\n","\u001b[31mERROR: spacy 2.2.4 has requirement tqdm<5.0.0,>=4.38.0, but you'll have tqdm 4.31.1 which is incompatible.\u001b[0m\n","\u001b[31mERROR: google-colab 1.0.0 has requirement requests~=2.23.0, but you'll have requests 2.21.0 which is incompatible.\u001b[0m\n","\u001b[31mERROR: fbprophet 0.7.1 has requirement tqdm>=4.36.1, but you'll have tqdm 4.31.1 which is incompatible.\u001b[0m\n","\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\u001b[0m\n","Installing collected packages: fire, regex, idna, requests, tqdm\n"," Found existing installation: regex 2019.12.20\n"," Uninstalling regex-2019.12.20:\n"," Successfully uninstalled regex-2019.12.20\n"," Found existing installation: idna 2.10\n"," Uninstalling idna-2.10:\n"," Successfully uninstalled idna-2.10\n"," Found existing installation: requests 2.23.0\n"," Uninstalling requests-2.23.0:\n"," Successfully uninstalled requests-2.23.0\n"," Found existing 
installation: tqdm 4.41.1\n"," Uninstalling tqdm-4.41.1:\n"," Successfully uninstalled tqdm-4.41.1\n","Successfully installed fire-0.3.1 idna-2.8 regex-2017.4.5 requests-2.21.0 tqdm-4.31.1\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"_kpNCnh9fyYD","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1611121682119,"user_tz":-330,"elapsed":6103,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}},"outputId":"828003c4-1dff-4c43-d438-4c91bc573ab1"},"source":["#@title Step 4 Checking the Version of TensorFlow \n","#Colab has tf 1.x and tf 2.x installed\n","#Restart runtime using 'Runtime' -> 'Restart runtime...'\n","%tensorflow_version 1.x\n","import tensorflow as tf\n","print(tf.__version__)"],"execution_count":3,"outputs":[{"output_type":"stream","text":["TensorFlow 1.x selected.\n","1.15.2\n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"jvVj0cLVkaPL","colab":{"base_uri":"https://localhost:8080/"},"executionInfo":{"status":"ok","timestamp":1611121728589,"user_tz":-330,"elapsed":30531,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}},"outputId":"c3cabb3a-0dbf-40aa-d231-5de3b242baab"},"source":["#@title Step 5: Downloading the 345M parameter GPT-2 Model\n","# run code and send argument\n","import os # after runtime is restarted\n","os.chdir(\"/content/gpt-2\")\n","!python3 download_model.py '345M' "],"execution_count":4,"outputs":[{"output_type":"stream","text":["Fetching checkpoint: 1.00kit [00:00, 945kit/s] \n","Fetching encoder.json: 1.04Mit [00:00, 3.97Mit/s] \n","Fetching hparams.json: 1.00kit [00:00, 944kit/s] \n","Fetching model.ckpt.data-00000-of-00001: 1.42Git [00:27, 51.2Mit/s] \n","Fetching model.ckpt.index: 11.0kit [00:00, 9.49Mit/s] \n","Fetching 
model.ckpt.meta: 927kit [00:00, 3.13Mit/s] \n","Fetching vocab.bpe: 457kit [00:00, 2.48Mit/s] \n"],"name":"stdout"}]},{"cell_type":"code","metadata":{"id":"boCr2SydkydA","executionInfo":{"status":"ok","timestamp":1611121821353,"user_tz":-330,"elapsed":1106,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}}},"source":["#@title Step 6: Printing UTF encoded text to the console\n","!export PYTHONIOENCODING=UTF-8"],"execution_count":5,"outputs":[]},{"cell_type":"code","metadata":{"id":"T7C7JhElk-Lh","executionInfo":{"status":"ok","timestamp":1611121828604,"user_tz":-330,"elapsed":1043,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}}},"source":["#@title Step 7: Project Source Code\n","import os # import after runtime is restarted\n","os.chdir(\"/content/gpt-2/src\")"],"execution_count":6,"outputs":[]},{"cell_type":"code","metadata":{"id":"ckSsdAnblFIg","executionInfo":{"status":"ok","timestamp":1611121842649,"user_tz":-330,"elapsed":1122,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}}},"source":["#@title Step 7a: Interactive Conditional Samples (src)\n","#Project Source Code for Interactive Conditional Samples:\n","# /content/gpt-2/src/interactive_conditional_samples.py file \n","import json\n","import os\n","import numpy as np\n","import tensorflow as tf"],"execution_count":7,"outputs":[]},{"cell_type":"code","metadata":{"id":"2mtuJxl8tb_B","executionInfo":{"status":"ok","timestamp":1611121856018,"user_tz":-330,"elapsed":3099,"user":{"displayName":"Karan 
Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}}},"source":["#@title Step 7b: Importing model sample encoder\n","import model, sample, encoder\n","#if following message:\n","#ModuleNotFoundError: No module named 'tensorflow.contrib'\n","#then go back and run Step 2 Checking TensorFlow version "],"execution_count":8,"outputs":[]},{"cell_type":"code","metadata":{"id":"SAuHo4TilJhQ","executionInfo":{"status":"ok","timestamp":1611121861066,"user_tz":-330,"elapsed":1058,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}}},"source":["#@title Step 8: Defining the model\n","def interact_model(\n"," model_name,\n"," seed,\n"," nsamples,\n"," batch_size,\n"," length,\n"," temperature,\n"," top_k,\n"," models_dir\n","):\n"," models_dir = os.path.expanduser(os.path.expandvars(models_dir))\n"," if batch_size is None:\n"," batch_size = 1\n"," assert nsamples % batch_size == 0\n","\n"," enc = encoder.get_encoder(model_name, models_dir)\n"," hparams = model.default_hparams()\n"," with open(os.path.join(models_dir, model_name, 'hparams.json')) as f:\n"," hparams.override_from_dict(json.load(f))\n","\n"," if length is None:\n"," length = hparams.n_ctx // 2\n"," elif length > hparams.n_ctx:\n"," raise ValueError(\"Can't get samples longer than window size: %s\" % hparams.n_ctx)\n","\n"," with tf.Session(graph=tf.Graph()) as sess:\n"," context = tf.placeholder(tf.int32, [batch_size, None])\n"," np.random.seed(seed)\n"," tf.set_random_seed(seed)\n"," output = sample.sample_sequence(\n"," hparams=hparams, length=length,\n"," context=context,\n"," batch_size=batch_size,\n"," temperature=temperature, top_k=top_k\n"," )\n","\n"," saver = tf.train.Saver()\n"," ckpt = tf.train.latest_checkpoint(os.path.join(models_dir, model_name))\n"," saver.restore(sess, ckpt)\n","\n"," while 
True:\n"," raw_text = input(\"Model prompt >>> \")\n"," while not raw_text:\n"," print('Prompt should not be empty!')\n"," raw_text = input(\"Model prompt >>> \")\n"," context_tokens = enc.encode(raw_text)\n"," generated = 0\n"," for _ in range(nsamples // batch_size):\n"," out = sess.run(output, feed_dict={\n"," context: [context_tokens for _ in range(batch_size)]\n"," })[:, len(context_tokens):]\n"," for i in range(batch_size):\n"," generated += 1\n"," text = enc.decode(out[i])\n"," print(\"=\" * 40 + \" SAMPLE \" + str(generated) + \" \" + \"=\" * 40)\n"," print(text)\n"," print(\"=\" * 80)"],"execution_count":9,"outputs":[]},{"cell_type":"code","metadata":{"id":"P8Prbrs-UHu3","colab":{"base_uri":"https://localhost:8080/","height":976},"executionInfo":{"status":"error","timestamp":1611127917030,"user_tz":-330,"elapsed":4045790,"user":{"displayName":"Karan Sonawane","photoUrl":"https://lh3.googleusercontent.com/a-/AOh14GjWjX1_4b0iu2fEkjbIRKIHq-Molc5N_CnbcU75=s64","userId":"05479461208077736330"}},"outputId":"9f768ee1-75a5-499a-f7f9-be0889a29f22"},"source":["#@title Step 9: Interacting with GPT-2 \r\n","interact_model('345M',None,1,1,300,1,0,'/content/gpt-2/models')"],"execution_count":10,"outputs":[{"output_type":"stream","text":["WARNING:tensorflow:From /content/gpt-2/src/sample.py:51: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.\n","\n","WARNING:tensorflow:From /content/gpt-2/src/model.py:148: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.\n","\n","WARNING:tensorflow:From /content/gpt-2/src/model.py:152: The name tf.get_variable is deprecated. Please use tf.compat.v1.get_variable instead.\n","\n","WARNING:tensorflow:From /content/gpt-2/src/model.py:36: The name tf.rsqrt is deprecated. 
Please use tf.math.rsqrt instead.\n","\n","WARNING:tensorflow:From /content/gpt-2/src/sample.py:64: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n","Instructions for updating:\n","Use `tf.cast` instead.\n","WARNING:tensorflow:From /content/gpt-2/src/sample.py:39: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\n","Instructions for updating:\n","Use tf.where in 2.0, which has the same broadcast rule as np.where\n","WARNING:tensorflow:From /content/gpt-2/src/sample.py:67: multinomial (from tensorflow.python.ops.random_ops) is deprecated and will be removed in a future version.\n","Instructions for updating:\n","Use `tf.random.categorical` instead.\n","INFO:tensorflow:Restoring parameters from /content/gpt-2/models/345M/model.ckpt\n","======================================== SAMPLE 1 ========================================\n"," But to hold to sense alone, as to the only thing capable of constituting our perfection, is the very aim wherein nature herself establishes herself. This shall never be the final end of human reason, as I apprehend this to be; unless, indeed, it begins from spirit, and and passes through man to no other end: therefore intellectual ideas don't contemplate any hell, the existence of which the Saccadic demon of Illustration would require for perfection.\n","\n","Now, if you should see it thus, it will seem rather to refute the sensible traits of Plato who posited nature as an objective object, when she was anathema to his spirit. Now, by conceiving of the nature of its objects as hard, dull and insufferable objects, nature abounds in practicability to delineate every part of its external parts, and appears to furnish no more expository descriptions, than the manufacturer usually has to conform to the contents of his camera. Thus the Book of the Dead, ie. 
the lateral kings of Flight, which shall God Himself destroy in order to release man from mortal space, contains information with a memento verbi. Nor do human actions, or nerve-angle,! however fine, cease to move their parts towards things which lie in a strait, as instinct (baculum) says. But since the system always transfers itself, at, the same time to forwards and backwards, and cannot come to a stop with these reverses of\n","================================================================================\n"],"name":"stdout"},{"output_type":"error","ename":"KeyboardInterrupt","evalue":"ignored","traceback":["\u001b[0;31m---------------------------------------------------------------------------\u001b[0m","\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)","\u001b[0;32m/usr/local/lib/python3.6/dist-packages/ipykernel/kernelbase.py\u001b[0m in \u001b[0;36m_input_request\u001b[0;34m(self, prompt, ident, parent, password)\u001b[0m\n\u001b[1;32m 728\u001b[0m \u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 729\u001b[0;31m \u001b[0mident\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mreply\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0msession\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrecv\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mstdin_socket\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;36m0\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 730\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mException\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n","\u001b[0;32m/usr/local/lib/python3.6/dist-packages/jupyter_client/session.py\u001b[0m in \u001b[0;36mrecv\u001b[0;34m(self, socket, mode, content, copy)\u001b[0m\n\u001b[1;32m 802\u001b[0m 
\u001b[0;32mtry\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 803\u001b[0;31m \u001b[0mmsg_list\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0msocket\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrecv_multipart\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mmode\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcopy\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcopy\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 804\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mzmq\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mZMQError\u001b[0m \u001b[0;32mas\u001b[0m \u001b[0me\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n","\u001b[0;32m/usr/local/lib/python3.6/dist-packages/zmq/sugar/socket.py\u001b[0m in \u001b[0;36mrecv_multipart\u001b[0;34m(self, flags, copy, track)\u001b[0m\n\u001b[1;32m 565\u001b[0m \"\"\"\n\u001b[0;32m--> 566\u001b[0;31m \u001b[0mparts\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0;34m[\u001b[0m\u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mrecv\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mflags\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mcopy\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mcopy\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mtrack\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0mtrack\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 567\u001b[0m \u001b[0;31m# have first part already, only loop while more to receive\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n","\u001b[0;32mzmq/backend/cython/socket.pyx\u001b[0m in \u001b[0;36mzmq.backend.cython.socket.Socket.recv\u001b[0;34m()\u001b[0m\n","\u001b[0;32mzmq/backend/cython/socket.pyx\u001b[0m in \u001b[0;36mzmq.backend.cython.socket.Socket.recv\u001b[0;34m()\u001b[0m\n","\u001b[0;32mzmq/backend/cython/socket.pyx\u001b[0m in 
\u001b[0;36mzmq.backend.cython.socket._recv_copy\u001b[0;34m()\u001b[0m\n","\u001b[0;32m/usr/local/lib/python3.6/dist-packages/zmq/backend/cython/checkrc.pxd\u001b[0m in \u001b[0;36mzmq.backend.cython.checkrc._check_rc\u001b[0;34m()\u001b[0m\n","\u001b[0;31mKeyboardInterrupt\u001b[0m: ","\nDuring handling of the above exception, another exception occurred:\n","\u001b[0;31mKeyboardInterrupt\u001b[0m Traceback (most recent call last)","\u001b[0;32m\u001b[0m in \u001b[0;36m\u001b[0;34m()\u001b[0m\n\u001b[1;32m 1\u001b[0m \u001b[0;31m#@title Step 9: Interacting with GPT-2\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 2\u001b[0;31m \u001b[0minteract_model\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'345M'\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;32mNone\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;36m300\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;36m1\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;36m0\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m'/content/gpt-2/models'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m","\u001b[0;32m\u001b[0m in \u001b[0;36minteract_model\u001b[0;34m(model_name, seed, nsamples, batch_size, length, temperature, top_k, models_dir)\u001b[0m\n\u001b[1;32m 41\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 42\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0;32mTrue\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 43\u001b[0;31m \u001b[0mraw_text\u001b[0m \u001b[0;34m=\u001b[0m \u001b[0minput\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"Model prompt >>> \"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 44\u001b[0m \u001b[0;32mwhile\u001b[0m \u001b[0;32mnot\u001b[0m \u001b[0mraw_text\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 
45\u001b[0m \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m'Prompt should not be empty!'\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n","\u001b[0;32m/usr/local/lib/python3.6/dist-packages/ipykernel/kernelbase.py\u001b[0m in \u001b[0;36mraw_input\u001b[0;34m(self, prompt)\u001b[0m\n\u001b[1;32m 702\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_parent_ident\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 703\u001b[0m \u001b[0mself\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0m_parent_header\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 704\u001b[0;31m \u001b[0mpassword\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;32mFalse\u001b[0m\u001b[0;34m,\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 705\u001b[0m )\n\u001b[1;32m 706\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n","\u001b[0;32m/usr/local/lib/python3.6/dist-packages/ipykernel/kernelbase.py\u001b[0m in \u001b[0;36m_input_request\u001b[0;34m(self, prompt, ident, parent, password)\u001b[0m\n\u001b[1;32m 732\u001b[0m \u001b[0;32mexcept\u001b[0m \u001b[0mKeyboardInterrupt\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 733\u001b[0m \u001b[0;31m# re-raise KeyboardInterrupt, to truncate traceback\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m--> 734\u001b[0;31m \u001b[0;32mraise\u001b[0m \u001b[0mKeyboardInterrupt\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m 735\u001b[0m \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m 736\u001b[0m \u001b[0;32mbreak\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n","\u001b[0;31mKeyboardInterrupt\u001b[0m: "]}]}]} ================================================ FILE: 
Chapter06/Training_OpenAI_GPT_2.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Training OpenAI GPT-2.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "accelerator": "GPU" }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "LH2YgC7LfzJZ", "colab_type": "text" }, "source": [ "#Training OpenAI GPT-2\n", "Copyright 2020, Denis Rothman MIT License. Denis Rothman created the Colab notebook using the OpenAI repository, adding title steps for educational purposes only.\n", "\n", "***Code References***\n", "\n", "[Reference: OpenAI Repository](https://github.com/openai/gpt-2)\n", "The repository was cloned and adapted to N Shepperd's repository.\n", "\n", "[Reference: N Shepperd Repository](https://github.com/nshepperd/gpt-2)\n", "The repository was not cloned. N Shepperd's training programs were inserted into the OpenAI Repository. The list of N Shepperd's programs is cited in the 'N Shepperd' section of the notebook. Some programs were modified for educational purposes only to work with this notebook.\n", "\n", "***Model Reference Paper***\n", "\n", "[Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever,2019,'Language Models are Unsupervised Multitask Learners'](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)\n", "\n", "\n", "***Step 1: Pre-requisites:***\n", "\n", "a) activate GPU in the notebook settings Runtime menu
\n", "b) Upload the following program files and dset.txt(dataset) with the file manager: train.py,load_dataset.py,encode.py,accumulate,memory_saving_gradients.py,dset.txt" ] }, { "cell_type": "code", "metadata": { "id": "isqdu1fpfmqM", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 102 }, "outputId": "1d38b600-5f4a-4d66-a00c-f8cab8f5a158" }, "source": [ "#@title Step 2: Cloning the OpenAI GPT-2 Repository \n", "#!git clone https://github.com/nshepperd/gpt-2.git\n", "!git clone https://github.com/openai/gpt-2.git" ], "execution_count": 1, "outputs": [ { "output_type": "stream", "text": [ "Cloning into 'gpt-2'...\n", "remote: Enumerating objects: 230, done.\u001b[K\n", "remote: Total 230 (delta 0), reused 0 (delta 0), pack-reused 230\u001b[K\n", "Receiving objects: 100% (230/230), 4.38 MiB | 6.13 MiB/s, done.\n", "Resolving deltas: 100% (119/119), done.\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "7RHOjN-TjUbj", "colab_type": "code", "colab": {} }, "source": [ "#@title Step 3: Installing the requirements\n", "import os # when the VM restarts import os necessary\n", "os.chdir(\"/content/gpt-2\") \n", "!pip3 install -r requirements.txt" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "q9vV73Opw68m", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 105 }, "outputId": "482b8e2a-4e62-437c-dcf1-1046c2d28a0a" }, "source": [ "!pip install toposort" ], "execution_count": 3, "outputs": [ { "output_type": "stream", "text": [ "Collecting toposort\n", " Downloading https://files.pythonhosted.org/packages/e9/8a/321cd8ea5f4a22a06e3ba30ef31ec33bea11a3443eeb1d89807640ee6ed4/toposort-1.5-py2.py3-none-any.whl\n", "Installing collected packages: toposort\n", "Successfully installed toposort-1.5\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "_kpNCnh9fyYD", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", 
"height": 51 }, "outputId": "4d140c05-569d-4dc6-a2ba-faaa883aac88" }, "source": [ "#@title Step 4: Checking TensorFlow version \n", "#Colab has tf 1.x and tf 2.x installed\n", "#Restart runtime using 'Runtime' -> 'Restart runtime...'\n", "%tensorflow_version 1.x\n", "import tensorflow as tf\n", "print(tf.__version__)" ], "execution_count": 1, "outputs": [ { "output_type": "stream", "text": [ "TensorFlow 1.x selected.\n", "1.15.2\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "jvVj0cLVkaPL", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 136 }, "outputId": "eb34c742-4323-45de-b5bb-bf403cbed7c8" }, "source": [ "#@title Step 5: Downloading 117M parameter GPT-2 Model\n", "# run code and send argument\n", "import os # after runtime is restarted\n", "os.chdir(\"/content/gpt-2\")\n", "!python3 download_model.py '117M' #creates model directory" ], "execution_count": 2, "outputs": [ { "output_type": "stream", "text": [ "\rFetching checkpoint: 0%| | 0.00/77.0 [00:00 /path/to/output.npz # PYTHONPATH=src ./train --dataset /path/to/output.npz import argparse import numpy as np import encoder from load_dataset import load_dataset parser = argparse.ArgumentParser( description='Pre-encode text files into tokenized training set.', formatter_class=argparse.ArgumentDefaultsHelpFormatter) parser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name') parser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate files with <|endoftext|> separator into chunks of this minimum size') parser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.') parser.add_argument('in_text', metavar='PATH', type=str, help='Input file, directory, or glob pattern (utf-8 text).') parser.add_argument('out_npz', metavar='OUT.npz', type=str, help='Output file path') def main(): models_dir='/content/gpt-2/src/models' args 
= parser.parse_args() enc = encoder.get_encoder(args.model_name,models_dir) print('Reading files') chunks = load_dataset(enc, args.in_text, args.combine, encoding=args.encoding) print('Writing', args.out_npz) np.savez_compressed(args.out_npz, *chunks) if __name__ == '__main__': main() ================================================ FILE: Chapter06/gpt-2-train_files/load_dataset.py ================================================ import glob import numpy as np import os import tensorflow as tf import tqdm def load_dataset(enc, path, combine, encoding=None): paths = [] if os.path.isfile(path): # Simple file paths.append(path) elif os.path.isdir(path): # Directory for (dirpath, _, fnames) in os.walk(path): for fname in fnames: paths.append(os.path.join(dirpath, fname)) else: # Assume glob paths = glob.glob(path) token_chunks = [] raw_text = '' for path in tqdm.tqdm(paths): if path.endswith('.npz'): # Pre-encoded with np.load(path) as npz: for item in npz.files: token_chunks.append(npz[item]) else: # Plain text with open(path, 'r', encoding=encoding) as fp: raw_text += fp.read() if len(raw_text) >= combine: tokens = np.stack(enc.encode(raw_text)) token_chunks.append(tokens) raw_text = '' else: raw_text += '<|endoftext|>' if raw_text: tokens = np.stack(enc.encode(raw_text)) token_chunks.append(tokens) return token_chunks def binary_search(f, lo, hi): if f(lo) or not f(hi): return None while hi > lo + 1: mid = (lo + hi) // 2 if f(mid): hi = mid else: lo = mid return hi class Sampler(object): """Fairly samples a slice from a set of variable sized chunks. 
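encode.py above tokenizes the input and writes every token chunk as a separate array into one compressed `.npz` archive, which `load_dataset()` later reads back by iterating `npz.files`. A minimal sketch of that round trip (the path and token values here are made up):

```python
import os
import tempfile

import numpy as np

# encode.py does np.savez_compressed(out, *chunks); each chunk becomes
# a named array (arr_0, arr_1, ...) inside the archive.
chunks = [np.array([10, 11, 12]), np.array([20, 21])]
path = os.path.join(tempfile.mkdtemp(), "out.npz")  # hypothetical output path
np.savez_compressed(path, *chunks)

# load_dataset() reads pre-encoded .npz files back the same way:
with np.load(path) as npz:
    token_chunks = [npz[name] for name in sorted(npz.files)]
```
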
'Fairly' means that the distribution is the same as sampling from one concatenated chunk, but without crossing chunk boundaries.""" def __init__(self, chunks, seed=None): self.chunks = chunks self.total_size = sum(chunk.shape[0] for chunk in chunks) self.boundaries = [0] for i in range(len(chunks)): self.boundaries.append(self.boundaries[-1] + chunks[i].shape[0]) self.rs = np.random.RandomState(seed=seed) def sample(self, length): assert length < self.total_size // len( self.chunks ), "Dataset files are too small to sample {} tokens at a time".format( length) while True: index = self.rs.randint(0, self.total_size - length - 1) i = binary_search(lambda j: self.boundaries[j] > index, 0, len(self.boundaries) - 1) - 1 if self.boundaries[i + 1] > index + length: within_chunk = index - self.boundaries[i] return self.chunks[i][within_chunk:within_chunk + length] ================================================ FILE: Chapter06/gpt-2-train_files/memory_saving_gradients.py ================================================ from toposort import toposort import contextlib import numpy as np import tensorflow as tf import tensorflow.contrib.graph_editor as ge import time import sys sys.setrecursionlimit(10000) # refers back to current module if we decide to split helpers out util = sys.modules[__name__] # getting rid of "WARNING:tensorflow:VARIABLES collection name is deprecated" setattr(tf.GraphKeys, "VARIABLES", "variables") # save original gradients since tf.gradient could be monkey-patched to point # to our version from tensorflow.python.ops import gradients as tf_gradients_lib tf_gradients = tf_gradients_lib.gradients MIN_CHECKPOINT_NODE_SIZE=1024 # use lower value during testing # specific versions we can use to do process-wide replacement of tf.gradients def gradients_speed(ys, xs, grad_ys=None, **kwargs): return gradients(ys, xs, grad_ys, checkpoints='speed', **kwargs) def gradients_memory(ys, xs, grad_ys=None, **kwargs): return gradients(ys, xs, grad_ys, 
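The `Sampler` above draws a fixed-length window uniformly over the concatenated token stream and rejects any window that would cross a chunk boundary, so each chunk is sampled in proportion to its size. The same idea as a standalone function (hypothetical name, stdlib `bisect` in place of the custom `binary_search`):

```python
import bisect
import random

def fair_sample(chunks, length, rng):
    """Sample `length` consecutive tokens, uniformly over all positions
    in the concatenated chunks, without crossing a chunk boundary."""
    boundaries = [0]
    for chunk in chunks:
        boundaries.append(boundaries[-1] + len(chunk))
    total = boundaries[-1]
    while True:
        index = rng.randrange(0, total - length)
        i = bisect.bisect_right(boundaries, index) - 1  # chunk holding index
        if boundaries[i + 1] >= index + length:         # window fits entirely
            start = index - boundaries[i]
            return chunks[i][start:start + length]
```

Rejection sampling keeps the distribution identical to sampling from one big concatenated chunk, minus the windows that would straddle two documents.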
checkpoints='memory', **kwargs) def gradients_collection(ys, xs, grad_ys=None, **kwargs): return gradients(ys, xs, grad_ys, checkpoints='collection', **kwargs) def gradients(ys, xs, grad_ys=None, checkpoints='collection', **kwargs): ''' Authors: Tim Salimans & Yaroslav Bulatov memory efficient gradient implementation inspired by "Training Deep Nets with Sublinear Memory Cost" by Chen et al. 2016 (https://arxiv.org/abs/1604.06174) ys,xs,grad_ys,kwargs are the arguments to standard tensorflow tf.gradients (https://www.tensorflow.org/versions/r0.12/api_docs/python/train.html#gradients) 'checkpoints' can either be - a list consisting of tensors from the forward pass of the neural net that we should re-use when calculating the gradients in the backward pass all other tensors that do not appear in this list will be re-computed - a string specifying how this list should be determined. currently we support - 'speed': checkpoint all outputs of convolutions and matmuls. these ops are usually the most expensive, so checkpointing them maximizes the running speed (this is a good option if nonlinearities, concats, batchnorms, etc are taking up a lot of memory) - 'memory': try to minimize the memory usage (currently using a very simple strategy that identifies a number of bottleneck tensors in the graph to checkpoint) - 'collection': look for a tensorflow collection named 'checkpoints', which holds the tensors to checkpoint ''' # print("Calling memsaving gradients with", checkpoints) if not isinstance(ys,list): ys = [ys] if not isinstance(xs,list): xs = [xs] bwd_ops = ge.get_backward_walk_ops([y.op for y in ys], inclusive=True) debug_print("bwd_ops: %s", bwd_ops) # forward ops are all ops that are candidates for recomputation fwd_ops = ge.get_forward_walk_ops([x.op for x in xs], inclusive=True, within_ops=bwd_ops) debug_print("fwd_ops: %s", fwd_ops) # exclude ops with no inputs fwd_ops = [op for op in fwd_ops if op.inputs] # don't recompute xs, remove variables xs_ops = 
_to_ops(xs) fwd_ops = [op for op in fwd_ops if not op in xs_ops] fwd_ops = [op for op in fwd_ops if not '/assign' in op.name] fwd_ops = [op for op in fwd_ops if not '/Assign' in op.name] fwd_ops = [op for op in fwd_ops if not '/read' in op.name] ts_all = ge.filter_ts(fwd_ops, True) # get the tensors ts_all = [t for t in ts_all if '/read' not in t.name] ts_all = set(ts_all) - set(xs) - set(ys) # construct list of tensors to checkpoint during forward pass, if not # given as input if type(checkpoints) is not list: if checkpoints == 'collection': checkpoints = tf.get_collection('checkpoints') elif checkpoints == 'speed': # checkpoint all expensive ops to maximize running speed checkpoints = ge.filter_ts_from_regex(fwd_ops, 'conv2d|Conv|MatMul') elif checkpoints == 'memory': # remove very small tensors and some weird ops def fixdims(t): # tf.Dimension values are not compatible with int, convert manually try: return [int(e if e.value is not None else 64) for e in t] except: return [0] # unknown shape ts_all = [t for t in ts_all if np.prod(fixdims(t.shape)) > MIN_CHECKPOINT_NODE_SIZE] ts_all = [t for t in ts_all if 'L2Loss' not in t.name] ts_all = [t for t in ts_all if 'entropy' not in t.name] ts_all = [t for t in ts_all if 'FusedBatchNorm' not in t.name] ts_all = [t for t in ts_all if 'Switch' not in t.name] ts_all = [t for t in ts_all if 'dropout' not in t.name] # DV: FP16_FIX - need to add 'Cast' layer here to make it work for FP16 ts_all = [t for t in ts_all if 'Cast' not in t.name] # filter out all tensors that are inputs of the backward graph with util.capture_ops() as bwd_ops: tf_gradients(ys, xs, grad_ys, **kwargs) bwd_inputs = [t for op in bwd_ops for t in op.inputs] # list of tensors in forward graph that is in input to bwd graph ts_filtered = list(set(bwd_inputs).intersection(ts_all)) debug_print("Using tensors %s", ts_filtered) # try two slightly different ways of getting bottlenecks tensors # to checkpoint for ts in [ts_filtered, ts_all]: # get all 
bottlenecks in the graph bottleneck_ts = [] for t in ts: b = set(ge.get_backward_walk_ops(t.op, inclusive=True, within_ops=fwd_ops)) f = set(ge.get_forward_walk_ops(t.op, inclusive=False, within_ops=fwd_ops)) # check that there are not shortcuts b_inp = set([inp for op in b for inp in op.inputs]).intersection(ts_all) f_inp = set([inp for op in f for inp in op.inputs]).intersection(ts_all) if not set(b_inp).intersection(f_inp) and len(b_inp)+len(f_inp) >= len(ts_all): bottleneck_ts.append(t) # we have a bottleneck! else: debug_print("Rejected bottleneck candidate and ops %s", [t] + list(set(ts_all) - set(b_inp) - set(f_inp))) # success? or try again without filtering? if len(bottleneck_ts) >= np.sqrt(len(ts_filtered)): # yes, enough bottlenecks found! break if not bottleneck_ts: raise Exception('unable to find bottleneck tensors! please provide checkpoint nodes manually, or use checkpoints="speed".') # sort the bottlenecks bottlenecks_sorted_lists = tf_toposort(bottleneck_ts, within_ops=fwd_ops) sorted_bottlenecks = [t for ts in bottlenecks_sorted_lists for t in ts] # save an approximately optimal number ~ sqrt(N) N = len(ts_filtered) if len(bottleneck_ts) <= np.ceil(np.sqrt(N)): checkpoints = sorted_bottlenecks else: step = int(np.ceil(len(bottleneck_ts) / np.sqrt(N))) checkpoints = sorted_bottlenecks[step::step] else: raise Exception('%s is unsupported input for "checkpoints"' % (checkpoints,)) checkpoints = list(set(checkpoints).intersection(ts_all)) # at this point automatic selection happened and checkpoints is list of nodes assert isinstance(checkpoints, list) debug_print("Checkpoint nodes used: %s", checkpoints) # better error handling of special cases # xs are already handled as checkpoint nodes, so no need to include them xs_intersect_checkpoints = set(xs).intersection(set(checkpoints)) if xs_intersect_checkpoints: debug_print("Warning, some input nodes are also checkpoint nodes: %s", xs_intersect_checkpoints) ys_intersect_checkpoints = 
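The `'memory'` strategy above ends with the `~ sqrt(N)` selection: keeping roughly √N evenly spaced bottleneck tensors balances stored activations against recomputation, the core trade-off of the Chen et al. sublinear-memory scheme this file implements. That selection step in isolation (hypothetical function name):

```python
import math

def select_checkpoints(sorted_bottlenecks, n_candidates):
    """Keep ~sqrt(n_candidates) evenly spaced entries, mirroring the
    '~ sqrt(N)' step in gradients() above."""
    if len(sorted_bottlenecks) <= math.ceil(math.sqrt(n_candidates)):
        return sorted_bottlenecks
    step = int(math.ceil(len(sorted_bottlenecks) / math.sqrt(n_candidates)))
    return sorted_bottlenecks[step::step]
```
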
set(ys).intersection(set(checkpoints)) debug_print("ys: %s, checkpoints: %s, intersect: %s", ys, checkpoints, ys_intersect_checkpoints) # saving an output node (ys) gives no benefit in memory while creating # new edge cases, exclude them if ys_intersect_checkpoints: debug_print("Warning, some output nodes are also checkpoints nodes: %s", format_ops(ys_intersect_checkpoints)) # remove initial and terminal nodes from checkpoints list if present checkpoints = list(set(checkpoints) - set(ys) - set(xs)) # check that we have some nodes to checkpoint # if not checkpoints: # raise Exception('no checkpoints nodes found or given as input! ') # disconnect dependencies between checkpointed tensors checkpoints_disconnected = {} for x in checkpoints: if x.op and x.op.name is not None: grad_node = tf.stop_gradient(x, name=x.op.name+"_sg") else: grad_node = tf.stop_gradient(x) checkpoints_disconnected[x] = grad_node # partial derivatives to the checkpointed tensors and xs ops_to_copy = fast_backward_ops(seed_ops=[y.op for y in ys], stop_at_ts=checkpoints, within_ops=fwd_ops) debug_print("Found %s ops to copy within fwd_ops %s, seed %s, stop_at %s", len(ops_to_copy), fwd_ops, [r.op for r in ys], checkpoints) debug_print("ops_to_copy = %s", ops_to_copy) debug_print("Processing list %s", ys) copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {}) for origin_op, op in info._transformed_ops.items(): op._set_device(origin_op.node_def.device) copied_ops = info._transformed_ops.values() debug_print("Copied %s to %s", ops_to_copy, copied_ops) ge.reroute_ts(checkpoints_disconnected.values(), checkpoints_disconnected.keys(), can_modify=copied_ops) debug_print("Rewired %s in place of %s restricted to %s", checkpoints_disconnected.values(), checkpoints_disconnected.keys(), copied_ops) # get gradients with respect to current boundary + original x's copied_ys = [info._transformed_ops[y.op]._outputs[0] for y in ys] boundary = list(checkpoints_disconnected.values()) dv = 
tf_gradients(ys=copied_ys, xs=boundary+xs, grad_ys=grad_ys, **kwargs) debug_print("Got gradients %s", dv) debug_print("for %s", copied_ys) debug_print("with respect to %s", boundary+xs) inputs_to_do_before = [y.op for y in ys] if grad_ys is not None: inputs_to_do_before += grad_ys wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None] my_add_control_inputs(wait_to_do_ops, inputs_to_do_before) # partial derivatives to the checkpointed nodes # dictionary of "node: backprop" for nodes in the boundary d_checkpoints = {r: dr for r,dr in zip(checkpoints_disconnected.keys(), dv[:len(checkpoints_disconnected)])} # partial derivatives to xs (usually the params of the neural net) d_xs = dv[len(checkpoints_disconnected):] # incorporate derivatives flowing through the checkpointed nodes checkpoints_sorted_lists = tf_toposort(checkpoints, within_ops=fwd_ops) for ts in checkpoints_sorted_lists[::-1]: debug_print("Processing list %s", ts) checkpoints_other = [r for r in checkpoints if r not in ts] checkpoints_disconnected_other = [checkpoints_disconnected[r] for r in checkpoints_other] # copy part of the graph below current checkpoint node, stopping at # other checkpoints nodes ops_to_copy = fast_backward_ops(within_ops=fwd_ops, seed_ops=[r.op for r in ts], stop_at_ts=checkpoints_other) debug_print("Found %s ops to copy within %s, seed %s, stop_at %s", len(ops_to_copy), fwd_ops, [r.op for r in ts], checkpoints_other) debug_print("ops_to_copy = %s", ops_to_copy) if not ops_to_copy: # we're done! 
break copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {}) for origin_op, op in info._transformed_ops.items(): op._set_device(origin_op.node_def.device) copied_ops = info._transformed_ops.values() debug_print("Copied %s to %s", ops_to_copy, copied_ops) ge.reroute_ts(checkpoints_disconnected_other, checkpoints_other, can_modify=copied_ops) debug_print("Rewired %s in place of %s restricted to %s", checkpoints_disconnected_other, checkpoints_other, copied_ops) # gradient flowing through the checkpointed node boundary = [info._transformed_ops[r.op]._outputs[0] for r in ts] substitute_backprops = [d_checkpoints[r] for r in ts] dv = tf_gradients(boundary, checkpoints_disconnected_other+xs, grad_ys=substitute_backprops, **kwargs) debug_print("Got gradients %s", dv) debug_print("for %s", boundary) debug_print("with respect to %s", checkpoints_disconnected_other+xs) debug_print("with boundary backprop substitutions %s", substitute_backprops) inputs_to_do_before = [d_checkpoints[r].op for r in ts] wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None] my_add_control_inputs(wait_to_do_ops, inputs_to_do_before) # partial derivatives to the checkpointed nodes for r, dr in zip(checkpoints_other, dv[:len(checkpoints_other)]): if dr is not None: if d_checkpoints[r] is None: d_checkpoints[r] = dr else: d_checkpoints[r] += dr def _unsparsify(x): if not isinstance(x, tf.IndexedSlices): return x assert x.dense_shape is not None, "memory_saving_gradients encountered sparse gradients of unknown shape" indices = x.indices while indices.shape.ndims < x.values.shape.ndims: indices = tf.expand_dims(indices, -1) return tf.scatter_nd(indices, x.values, x.dense_shape) # partial derivatives to xs (usually the params of the neural net) d_xs_new = dv[len(checkpoints_other):] for j in range(len(xs)): if d_xs_new[j] is not None: if d_xs[j] is None: d_xs[j] = _unsparsify(d_xs_new[j]) else: d_xs[j] += _unsparsify(d_xs_new[j]) return d_xs def tf_toposort(ts, 
within_ops=None): all_ops = ge.get_forward_walk_ops([x.op for x in ts], within_ops=within_ops) deps = {} for op in all_ops: for o in op.outputs: deps[o] = set(op.inputs) sorted_ts = toposort(deps) # only keep the tensors from our original list ts_sorted_lists = [] for l in sorted_ts: keep = list(set(l).intersection(ts)) if keep: ts_sorted_lists.append(keep) return ts_sorted_lists def fast_backward_ops(within_ops, seed_ops, stop_at_ts): bwd_ops = set(ge.get_backward_walk_ops(seed_ops, stop_at_ts=stop_at_ts)) ops = bwd_ops.intersection(within_ops).difference([t.op for t in stop_at_ts]) return list(ops) @contextlib.contextmanager def capture_ops(): """Decorator to capture ops created in the block. with capture_ops() as ops: # create some ops print(ops) # => prints ops created. """ micros = int(time.time()*10**6) scope_name = str(micros) op_list = [] with tf.name_scope(scope_name): yield op_list g = tf.get_default_graph() op_list.extend(ge.select_ops(scope_name+"/.*", graph=g)) def _to_op(tensor_or_op): if hasattr(tensor_or_op, "op"): return tensor_or_op.op return tensor_or_op def _to_ops(iterable): if not _is_iterable(iterable): return iterable return [_to_op(i) for i in iterable] def _is_iterable(o): try: _ = iter(o) except Exception: return False return True DEBUG_LOGGING=False def debug_print(s, *args): """Like logger.log, but also replaces all TensorFlow ops/tensors with their names. Sensitive to value of DEBUG_LOGGING, see enable_debug/disable_debug Usage: debug_print("see tensors %s for %s", tensorlist, [1,2,3]) """ if DEBUG_LOGGING: formatted_args = [format_ops(arg) for arg in args] print("DEBUG "+s % tuple(formatted_args)) def format_ops(ops, sort_outputs=True): """Helper method for printing ops. 
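`tf_toposort` above builds a tensor dependency map and hands it to the `toposort` package, which yields "levels" of mutually independent nodes in dependency order. A dependency-free, Kahn-style sketch of that level-wise sort (hypothetical name):

```python
def topo_levels(deps):
    """Level-by-level topological sort: deps maps node -> set of
    prerequisite nodes (the structure tf_toposort builds from op.inputs)."""
    deps = {k: set(v) for k, v in deps.items()}
    for vs in list(deps.values()):          # add nodes that only appear as deps
        for v in vs:
            deps.setdefault(v, set())
    levels = []
    while deps:
        ready = {k for k, vs in deps.items() if not vs}
        if not ready:
            raise ValueError("dependency cycle")
        levels.append(ready)
        deps = {k: vs - ready for k, vs in deps.items() if k not in ready}
    return levels
```
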
Converts Tensor/Operation op to op.name, rest to str(op).""" if hasattr(ops, '__iter__') and not isinstance(ops, str): l = [(op.name if hasattr(op, "name") else str(op)) for op in ops] if sort_outputs: return sorted(l) return l else: return ops.name if hasattr(ops, "name") else str(ops) def my_add_control_inputs(wait_to_do_ops, inputs_to_do_before): for op in wait_to_do_ops: ci = [i for i in inputs_to_do_before if op.control_inputs is None or i not in op.control_inputs] ge.add_control_inputs(op, ci) ================================================ FILE: Chapter06/gpt-2-train_files/train.py ================================================ #!/usr/bin/env python3 # Usage: # PYTHONPATH=src ./train --dataset import argparse import json import os import numpy as np import tensorflow as tf import time import tqdm from tensorflow.core.protobuf import rewriter_config_pb2 import model, sample, encoder from load_dataset import load_dataset, Sampler from accumulate import AccumulatingOptimizer import memory_saving_gradients CHECKPOINT_DIR = 'checkpoint' SAMPLE_DIR = 'samples' parser = argparse.ArgumentParser( description='Fine-tune GPT-2 on your custom dataset.', formatter_class=argparse.ArgumentDefaultsHelpFormatter) parser.add_argument('--dataset', metavar='PATH', type=str, required=True, help='Input file, directory, or glob pattern (utf-8 text, or preencoded .npz files).') parser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name') parser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate input files with <|endoftext|> separator into chunks of this minimum size') parser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.') parser.add_argument('--batch_size', metavar='SIZE', type=int, default=1, help='Batch size') parser.add_argument('--learning_rate', metavar='LR', type=float, default=0.00002, help='Learning rate for Adam') 
parser.add_argument('--accumulate_gradients', metavar='N', type=int, default=1, help='Accumulate gradients across N minibatches.') parser.add_argument('--memory_saving_gradients', default=False, action='store_true', help='Use gradient checkpointing to reduce vram usage.') parser.add_argument('--only_train_transformer_layers', default=False, action='store_true', help='Restrict training to the transformer blocks.') parser.add_argument('--optimizer', type=str, default='adam', help='Optimizer. .') parser.add_argument('--noise', type=float, default=0.0, help='Add noise to input training data to regularize against typos.') parser.add_argument('--top_k', type=int, default=40, help='K for top-k sampling.') parser.add_argument('--top_p', type=float, default=0.0, help='P for top-p sampling. Overrides top_k if set > 0.') parser.add_argument('--restore_from', type=str, default='latest', help='Either "latest", "fresh", or a path to a checkpoint file') parser.add_argument('--run_name', type=str, default='run1', help='Run id. 
Name of subdirectory in checkpoint/ and samples/') parser.add_argument('--sample_every', metavar='N', type=int, default=100, help='Generate samples every N steps') parser.add_argument('--sample_length', metavar='TOKENS', type=int, default=1023, help='Sample this many tokens') parser.add_argument('--sample_num', metavar='N', type=int, default=1, help='Generate this many samples') parser.add_argument('--save_every', metavar='N', type=int, default=1000, help='Write a checkpoint every N steps') parser.add_argument('--val_dataset', metavar='PATH', type=str, default=None, help='Dataset for validation loss, defaults to --dataset.') parser.add_argument('--val_batch_size', metavar='SIZE', type=int, default=2, help='Batch size for validation.') parser.add_argument('--val_batch_count', metavar='N', type=int, default=40, help='Number of batches for validation.') parser.add_argument('--val_every', metavar='STEPS', type=int, default=0, help='Calculate validation loss every STEPS steps.') def maketree(path): try: os.makedirs(path) except: pass def randomize(context, hparams, p): if p > 0: mask = tf.random.uniform(shape=tf.shape(context)) < p noise = tf.random.uniform(shape=tf.shape(context), minval=0, maxval=hparams.n_vocab, dtype=tf.int32) return tf.where(mask, noise, context) else: return context def main(): args = parser.parse_args() models_dir='/content/gpt-2/src/models' enc = encoder.get_encoder(args.model_name,models_dir) hparams = model.default_hparams() with open(os.path.join('models', args.model_name, 'hparams.json')) as f: hparams.override_from_dict(json.load(f)) if args.sample_length > hparams.n_ctx: raise ValueError( "Can't get samples longer than window size: %s" % hparams.n_ctx) if args.model_name == '345M': args.memory_saving_gradients = True if args.optimizer == 'adam': args.only_train_transformer_layers = True config = tf.ConfigProto() config.gpu_options.allow_growth = True config.graph_options.rewrite_options.layout_optimizer = 
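`randomize()` in train.py above implements the `--noise` option: with probability `p`, each input token is replaced by a random vocabulary id, regularizing the model against typos. A NumPy equivalent of the same masking (hypothetical name and seed):

```python
import numpy as np

def randomize_tokens(context, n_vocab, p, seed=0):
    """Replace each token with a uniform random id with probability p,
    mirroring randomize() in train.py."""
    context = np.asarray(context)
    if p <= 0:
        return context
    rng = np.random.default_rng(seed)
    mask = rng.random(context.shape) < p
    noise = rng.integers(0, n_vocab, size=context.shape)
    return np.where(mask, noise, context)
```
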
rewriter_config_pb2.RewriterConfig.OFF with tf.Session(config=config) as sess: context = tf.placeholder(tf.int32, [args.batch_size, None]) context_in = randomize(context, hparams, args.noise) output = model.model(hparams=hparams, X=context_in) loss = tf.reduce_mean( tf.nn.sparse_softmax_cross_entropy_with_logits( labels=context[:, 1:], logits=output['logits'][:, :-1])) if args.val_every > 0: val_context = tf.placeholder(tf.int32, [args.val_batch_size, None]) val_output = model.model(hparams=hparams, X=val_context) val_loss = tf.reduce_mean( tf.nn.sparse_softmax_cross_entropy_with_logits( labels=val_context[:, 1:], logits=val_output['logits'][:, :-1])) val_loss_summary = tf.summary.scalar('val_loss', val_loss) tf_sample = sample.sample_sequence( hparams=hparams, length=args.sample_length, context=context, batch_size=args.batch_size, temperature=1.0, top_k=args.top_k, top_p=args.top_p) all_vars = [v for v in tf.trainable_variables() if 'model' in v.name] train_vars = [v for v in all_vars if '/h' in v.name] if args.only_train_transformer_layers else all_vars if args.optimizer == 'adam': opt = tf.train.AdamOptimizer(learning_rate=args.learning_rate) elif args.optimizer == 'sgd': opt = tf.train.GradientDescentOptimizer(learning_rate=args.learning_rate) else: exit('Bad optimizer:', args.optimizer) if args.accumulate_gradients > 1: if args.memory_saving_gradients: exit("Memory saving gradients are not implemented for gradient accumulation yet.") opt = AccumulatingOptimizer( opt=opt, var_list=train_vars) opt_reset = opt.reset() opt_compute = opt.compute_gradients(loss) opt_apply = opt.apply_gradients() summary_loss = tf.summary.scalar('loss', opt_apply) else: if args.memory_saving_gradients: opt_grads = memory_saving_gradients.gradients(loss, train_vars) else: opt_grads = tf.gradients(loss, train_vars) opt_grads = list(zip(opt_grads, train_vars)) opt_apply = opt.apply_gradients(opt_grads) summary_loss = tf.summary.scalar('loss', loss) summary_lr = 
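The loss built above shifts the sequence by one token: position *t*'s logits are scored against token *t+1* (`labels=context[:, 1:]`, `logits=output['logits'][:, :-1]`), the standard language-modeling objective. A NumPy sketch of that shifted cross-entropy (hypothetical helper):

```python
import numpy as np

def shifted_xent(logits, tokens):
    """Mean cross-entropy of predicting token t+1 from position t.
    logits: (batch, time, vocab); tokens: (batch, time) int ids."""
    logits, tokens = logits[:, :-1], tokens[:, 1:]
    z = logits - logits.max(-1, keepdims=True)          # stable log-softmax
    logp = z - np.log(np.exp(z).sum(-1, keepdims=True))
    b, t = tokens.shape
    return -logp[np.arange(b)[:, None], np.arange(t)[None, :], tokens].mean()
```

Uniform logits give a loss of ln(vocab), a handy sanity check for an untrained model.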
tf.summary.scalar('learning_rate', args.learning_rate) summaries = tf.summary.merge([summary_lr, summary_loss]) summary_log = tf.summary.FileWriter( os.path.join(CHECKPOINT_DIR, args.run_name)) saver = tf.train.Saver( var_list=all_vars, max_to_keep=5, keep_checkpoint_every_n_hours=2) sess.run(tf.global_variables_initializer()) if args.restore_from == 'latest': ckpt = tf.train.latest_checkpoint( os.path.join(CHECKPOINT_DIR, args.run_name)) if ckpt is None: # Get fresh GPT weights if new run. ckpt = tf.train.latest_checkpoint( os.path.join('models', args.model_name)) elif args.restore_from == 'fresh': ckpt = tf.train.latest_checkpoint( os.path.join('models', args.model_name)) else: ckpt = tf.train.latest_checkpoint(args.restore_from) print('Loading checkpoint', ckpt) saver.restore(sess, ckpt) print('Loading dataset...') chunks = load_dataset(enc, args.dataset, args.combine, encoding=args.encoding) data_sampler = Sampler(chunks) if args.val_every > 0: if args.val_dataset: val_chunks = load_dataset(enc, args.val_dataset, args.combine, encoding=args.encoding) else: val_chunks = chunks print('dataset has', data_sampler.total_size, 'tokens') print('Training...') if args.val_every > 0: # Sample from validation set once with fixed seed to make # it deterministic during training as well as across runs. 
val_data_sampler = Sampler(val_chunks, seed=1) val_batches = [[val_data_sampler.sample(1024) for _ in range(args.val_batch_size)] for _ in range(args.val_batch_count)] counter = 1 counter_path = os.path.join(CHECKPOINT_DIR, args.run_name, 'counter') if os.path.exists(counter_path): # Load the step number if we're resuming a run # Add 1 so we don't immediately try to save again with open(counter_path, 'r') as fp: counter = int(fp.read()) + 1 def save(): maketree(os.path.join(CHECKPOINT_DIR, args.run_name)) print( 'Saving', os.path.join(CHECKPOINT_DIR, args.run_name, 'model-{}').format(counter)) saver.save( sess, os.path.join(CHECKPOINT_DIR, args.run_name, 'model'), global_step=counter) with open(counter_path, 'w') as fp: fp.write(str(counter) + '\n') def generate_samples(): print('Generating samples...') context_tokens = data_sampler.sample(1) all_text = [] index = 0 while index < args.sample_num: out = sess.run( tf_sample, feed_dict={context: args.batch_size * [context_tokens]}) for i in range(min(args.sample_num - index, args.batch_size)): text = enc.decode(out[i]) text = '======== SAMPLE {} ========\n{}\n'.format( index + 1, text) all_text.append(text) index += 1 print(text) maketree(os.path.join(SAMPLE_DIR, args.run_name)) with open( os.path.join(SAMPLE_DIR, args.run_name, 'samples-{}').format(counter), 'w', encoding=args.encoding) as fp: fp.write('\n'.join(all_text)) def validation(): print('Calculating validation loss...') losses = [] for batch in tqdm.tqdm(val_batches): losses.append(sess.run(val_loss, feed_dict={val_context: batch})) v_val_loss = np.mean(losses) v_summary = sess.run(val_loss_summary, feed_dict={val_loss: v_val_loss}) summary_log.add_summary(v_summary, counter) summary_log.flush() print( '[{counter} | {time:2.2f}] validation loss = {loss:2.2f}' .format( counter=counter, time=time.time() - start_time, loss=v_val_loss)) def sample_batch(): return [data_sampler.sample(1024) for _ in range(args.batch_size)] avg_loss = (0.0, 0.0) start_time = 
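The training loop that follows reports `avg` from the pair `avg_loss = (numerator, denominator)`, a decay-weighted, bias-corrected running mean of the loss. The same update in isolation:

```python
def make_ema(decay=0.99):
    """Bias-corrected exponential moving average: avg = num / den with
    num = num*decay + x and den = den*decay + 1, as in train.py's avg_loss."""
    num = den = 0.0
    def update(x):
        nonlocal num, den
        num = num * decay + x
        den = den * decay + 1.0
        return num / den
    return update
```

Dividing by the decayed count `den` removes the startup bias a plain `avg = avg*decay + x*(1-decay)` would have when `avg` starts at zero.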
time.time() try: while True: if counter % args.save_every == 0: save() if counter % args.sample_every == 0: generate_samples() if args.val_every > 0 and (counter % args.val_every == 0 or counter == 1): validation() if args.accumulate_gradients > 1: sess.run(opt_reset) for _ in range(args.accumulate_gradients): sess.run( opt_compute, feed_dict={context: sample_batch()}) (v_loss, v_summary) = sess.run((opt_apply, summaries)) else: (_, v_loss, v_summary) = sess.run( (opt_apply, loss, summaries), feed_dict={context: sample_batch()}) summary_log.add_summary(v_summary, counter) avg_loss = (avg_loss[0] * 0.99 + v_loss, avg_loss[1] * 0.99 + 1.0) print( '[{counter} | {time:2.2f}] loss={loss:2.2f} avg={avg:2.2f}' .format( counter=counter, time=time.time() - start_time, loss=v_loss, avg=avg_loss[0] / avg_loss[1])) counter += 1 except KeyboardInterrupt: print('interrupted') save() if __name__ == '__main__': main() ================================================ FILE: Chapter06/head_view_bert.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "head_view_bert.ipynb", "provenance": [], "collapsed_sections": [] }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "widgets": { "application/vnd.jupyter.widget-state+json": { "4d1bd7a205b94210ba8e1fd946d75821": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_1f2847673c374813ac442322e978eec7", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_f48103aee6dc4c06b176bc115be332e0", "IPY_MODEL_c6f8b3bf7fce4c928a0db3651813347d" ] } }, "1f2847673c374813ac442322e978eec7": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { 
"_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "f48103aee6dc4c06b176bc115be332e0": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_8d049af8ea834bf7b3a0fc7013fb3ff9", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 433, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 433, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_1dc9d68906594c1eb6db7c9f31876d2a" } }, "c6f8b3bf7fce4c928a0db3651813347d": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_f9d8ea0a95924b0596ab4b7f091a94b7", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", 
"_model_module_version": "1.5.0", "value": " 433/433 [00:00<00:00, 1.34kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_ba7f35525a7b4cb1951ed1a1e6a57ffd" } }, "8d049af8ea834bf7b3a0fc7013fb3ff9": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "1dc9d68906594c1eb6db7c9f31876d2a": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "f9d8ea0a95924b0596ab4b7f091a94b7": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", 
"description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "ba7f35525a7b4cb1951ed1a1e6a57ffd": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d3ee7a14538244b1b64abbeb24948102": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_34507ce588b04412aaacea76987f27ea", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_c32d39b32a144c4480cd8d6b1d6c199e", "IPY_MODEL_693433c2ec204437ac7878a8bee61647" ] } }, "34507ce588b04412aaacea76987f27ea": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { 
"_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "c32d39b32a144c4480cd8d6b1d6c199e": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_19a09d359acf496bb0bc68c63dea1e78", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 440473133, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 440473133, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_8e81da5616354ddb887419d81251096b" } }, "693433c2ec204437ac7878a8bee61647": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_2a084a03747d4c9984cc13136ccc4217", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", 
"_model_module_version": "1.5.0", "value": " 440M/440M [00:07<00:00, 61.0MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_679c7f033d2940f489382161964c5c0d" } }, "19a09d359acf496bb0bc68c63dea1e78": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "8e81da5616354ddb887419d81251096b": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "2a084a03747d4c9984cc13136ccc4217": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": 
"DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "679c7f033d2940f489382161964c5c0d": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "2861d6bdfed84911ab25f8175d718e2e": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_df575c6406c0426ca9f222b818896439", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_5f214f2f5a964fdc97eda797248f720f", "IPY_MODEL_40f8b46b4b8f44dcabd98d6a5e3044b3" ] } }, "df575c6406c0426ca9f222b818896439": { "model_module": "@jupyter-widgets/base", "model_name": 
"LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "5f214f2f5a964fdc97eda797248f720f": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_5a68a09e677d4118bc7e3efc18c35999", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 231508, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 231508, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_73ff50d8f1dc4f4e99062845a024a20b" } }, "40f8b46b4b8f44dcabd98d6a5e3044b3": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_b8aa22a0efcb4f43a752e21bdeb73589", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": 
"@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 232k/232k [00:00<00:00, 621kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_f2095ed84f644757888f5746c2a10ee4" } }, "5a68a09e677d4118bc7e3efc18c35999": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "73ff50d8f1dc4f4e99062845a024a20b": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "b8aa22a0efcb4f43a752e21bdeb73589": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", 
"_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "f2095ed84f644757888f5746c2a10ee4": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } } } } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "lqAxyueMKgXT" }, "source": [ "#BertViz\n", "\n", "Note: Denis Rothman added some titles to the sections of the reference notebook\n", "\n", "[Reference BertViz GitHub Repository by Jesse Vig](https://github.com/jessevig/bertviz)" ] }, { "cell_type": "code", "metadata": { "id": "zFo1IBx-x-rC", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "a2e29214-6b92-45d0-d072-39ac3195f63c" }, "source": [ "#@title Step 1: Installing BertViz and Requirements\n", "import sys\n", "!test -d bertviz_repo && echo \"FYI: bertviz_repo directory already exists, to pull latest 
version uncomment this line: !rm -r bertviz_repo\"\n", "# !rm -r bertviz_repo # Uncomment if you need a clean pull from repo\n", "!test -d bertviz_repo || git clone https://github.com/jessevig/bertviz bertviz_repo\n", "if not 'bertviz_repo' in sys.path:\n", " sys.path += ['bertviz_repo']\n", "!pip install regex\n", "!pip install transformers" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Cloning into 'bertviz_repo'...\n", "remote: Enumerating objects: 3, done.\u001b[K\n", "remote: Counting objects: 100% (3/3), done.\u001b[K\n", "remote: Compressing objects: 100% (3/3), done.\u001b[K\n", "remote: Total 1077 (delta 0), reused 2 (delta 0), pack-reused 1074\u001b[K\n", "Receiving objects: 100% (1077/1077), 100.00 MiB | 10.18 MiB/s, done.\n", "Resolving deltas: 100% (687/687), done.\n", "Requirement already satisfied: regex in /usr/local/lib/python3.6/dist-packages (2019.12.20)\n", "Collecting transformers\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/99/84/7bc03215279f603125d844bf81c3fb3f2d50fe8e511546eb4897e4be2067/transformers-4.0.0-py3-none-any.whl (1.4MB)\n", "\u001b[K |████████████████████████████████| 1.4MB 12.8MB/s \n", "\u001b[?25hCollecting sacremoses\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\n", "\u001b[K |████████████████████████████████| 890kB 50.3MB/s \n", "\u001b[?25hRequirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers) (20.4)\n", "Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers) (1.18.5)\n", "Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers) (2019.12.20)\n", "Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers) (4.41.1)\n", "Requirement already satisfied: requests in 
/usr/local/lib/python3.6/dist-packages (from transformers) (2.23.0)\n", "Requirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers) (3.0.12)\n", "Requirement already satisfied: dataclasses; python_version < \"3.7\" in /usr/local/lib/python3.6/dist-packages (from transformers) (0.8)\n", "Collecting tokenizers==0.9.4\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/0f/1c/e789a8b12e28be5bc1ce2156cf87cb522b379be9cadc7ad8091a4cc107c4/tokenizers-0.9.4-cp36-cp36m-manylinux2010_x86_64.whl (2.9MB)\n", "\u001b[K |████████████████████████████████| 2.9MB 41.0MB/s \n", "\u001b[?25hRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (1.15.0)\n", "Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (7.1.2)\n", "Requirement already satisfied: joblib in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (0.17.0)\n", "Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers) (2.4.7)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (1.24.3)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2020.11.8)\n", "Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2.10)\n", "Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (3.0.4)\n", "Building wheels for collected packages: sacremoses\n", " Building wheel for sacremoses (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n", " Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893257 sha256=3f95484e6bfe6dca37925ea4e4eb8f8fd8cd56a1234b54c56e663ce9d809bdc6\n", " Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\n", "Successfully built sacremoses\n", "Installing collected packages: sacremoses, tokenizers, transformers\n", "Successfully installed sacremoses-0.0.43 tokenizers-0.9.4 transformers-4.0.0\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "nCKW2hAUyK_4" }, "source": [ "#@title Step 2: Import BertViz Head Views and BERT \n", "from bertviz import head_view\n", "from transformers import BertTokenizer, BertModel" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "Mv6H9QK9yLLe" }, "source": [ "#@title Step 3: Defining the HTML Function\n", "def call_html():\n", " import IPython\n", " display(IPython.core.display.HTML('''\n", " \n", " \n", " '''))" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "fZAXH7hWyt58", "colab": { "resources": { "http://localhost:8080/static/components/requirejs/require.js": { "data": "/** vim: et:ts=4:sw=4:sts=4
 * @license RequireJS 2.1.22 Copyright (c) 2010-2015, The Dojo Foundation All Rights Reserved.
 * Available via the MIT or new BSD license.
 * see: http://github.com/jrburke/requirejs for details
 */
//Not using strict: uneven strict support in browsers, #392, and causes
//problems with requirejs.exec()/transpiler plugins that may not be strict.
/*jslint regexp: true, nomen: true, sloppy: true */
/*global window, navigator, document, importScripts, setTimeout, opera */

var requirejs, require, define;
(function (global) {
    var req, s, head, baseElement, dataMain, src,
        interactiveScript, currentlyAddingScript, mainScript, subPath,
        version = '2.1.22',
        commentRegExp = /(\/\*([\s\S]*?)\*\/|([^:]|^)\/\/(.*)$)/mg,
        cjsRequireRegExp = /[^.]\s*require\s*\(\s*["']([^'"\s]+)["']\s*\)/g,
        jsSuffixRegExp = /\.js$/,
        currDirRegExp = /^\.\//,
        op = Object.prototype,
        ostring = op.toString,
        hasOwn = op.hasOwnProperty,
        ap = Array.prototype,
        isBrowser = !!(typeof window !== 'undefined' && typeof navigator !== 'undefined' && window.document),
        isWebWorker = !isBrowser && typeof importScripts !== 'undefined',
        //PS3 indicates loaded and complete, but need to wait for complete
        //specifically. Sequence is 'loading', 'loaded', execution,
        // then 'complete'. The UA check is unfortunate, but not sure how
        //to feature test w/o causing perf issues.
        readyRegExp = isBrowser && navigator.platform === 'PLAYSTATION 3' ?
                      /^complete$/ : /^(complete|loaded)$/,
        defContextName = '_',
        //Oh the tragedy, detecting opera. See the usage of isOpera for reason.
        isOpera = typeof opera !== 'undefined' && opera.toString() === '[object Opera]',
        contexts = {},
        cfg = {},
        globalDefQueue = [],
        useInteractive = false;

    function isFunction(it) {
        return ostring.call(it) === '[object Function]';
    }

    function isArray(it) {
        return ostring.call(it) === '[object Array]';
    }

    /**
     * Helper function for iterating over an array. If the func returns
     * a true value, it will break out of the loop.
     */
    function each(ary, func) {
        if (ary) {
            var i;
            for (i = 0; i < ary.length; i += 1) {
                if (ary[i] && func(ary[i], i, ary)) {
                    break;
                }
            }
        }
    }

    /**
     * Helper function for iterating over an array backwards. If the func
     * returns a true value, it will break out of the loop.
     */
    function eachReverse(ary, func) {
        if (ary) {
            var i;
            for (i = ary.length - 1; i > -1; i -= 1) {
                if (ary[i] && func(ary[i], i, ary)) {
                    break;
                }
            }
        }
    }

    function hasProp(obj, prop) {
        return hasOwn.call(obj, prop);
    }

    function getOwn(obj, prop) {
        return hasProp(obj, prop) && obj[prop];
    }

    /**
     * Cycles over properties in an object and calls a function for each
     * property value. If the function returns a truthy value, then the
     * iteration is stopped.
     */
    function eachProp(obj, func) {
        var prop;
        for (prop in obj) {
            if (hasProp(obj, prop)) {
                if (func(obj[prop], prop)) {
                    break;
                }
            }
        }
    }

    /**
     * Simple function to mix in properties from source into target,
     * but only if target does not already have a property of the same name.
     */
    function mixin(target, source, force, deepStringMixin) {
        if (source) {
            eachProp(source, function (value, prop) {
                if (force || !hasProp(target, prop)) {
                    if (deepStringMixin && typeof value === 'object' && value &&
                        !isArray(value) && !isFunction(value) &&
                        !(value instanceof RegExp)) {

                        if (!target[prop]) {
                            target[prop] = {};
                        }
                        mixin(target[prop], value, force, deepStringMixin);
                    } else {
                        target[prop] = value;
                    }
                }
            });
        }
        return target;
    }

    //Similar to Function.prototype.bind, but the 'this' object is specified
    //first, since it is easier to read/figure out what 'this' will be.
    function bind(obj, fn) {
        return function () {
            return fn.apply(obj, arguments);
        };
    }

    function scripts() {
        return document.getElementsByTagName('script');
    }

    function defaultOnError(err) {
        throw err;
    }

    //Allow getting a global that is expressed in
    //dot notation, like 'a.b.c'.
    function getGlobal(value) {
        if (!value) {
            return value;
        }
        var g = global;
        each(value.split('.'), function (part) {
            g = g[part];
        });
        return g;
    }

    /**
     * Constructs an error with a pointer to an URL with more information.
     * @param {String} id the error ID that maps to an ID on a web page.
     * @param {String} message human readable error.
     * @param {Error} [err] the original error, if there is one.
     *
     * @returns {Error}
     */
    function makeError(id, msg, err, requireModules) {
        var e = new Error(msg + '\nhttp://requirejs.org/docs/errors.html#' + id);
        e.requireType = id;
        e.requireModules = requireModules;
        if (err) {
            e.originalError = err;
        }
        return e;
    }

    if (typeof define !== 'undefined') {
        //If a define is already in play via another AMD loader,
        //do not overwrite.
        return;
    }

    if (typeof requirejs !== 'undefined') {
        if (isFunction(requirejs)) {
            //Do not overwrite an existing requirejs instance.
            return;
        }
        cfg = requirejs;
        requirejs = undefined;
    }

    //Allow for a require config object
    if (typeof require !== 'undefined' && !isFunction(require)) {
        //assume it is a config object.
        cfg = require;
        require = undefined;
    }

    function newContext(contextName) {
        var inCheckLoaded, Module, context, handlers,
            checkLoadedTimeoutId,
            config = {
                //Defaults. Do not set a default for map
                //config to speed up normalize(), which
                //will run faster if there is no default.
                waitSeconds: 7,
                baseUrl: './',
                paths: {},
                bundles: {},
                pkgs: {},
                shim: {},
                config: {}
            },
            registry = {},
            //registry of just enabled modules, to speed
            //cycle breaking code when lots of modules
            //are registered, but not activated.
            enabledRegistry = {},
            undefEvents = {},
            defQueue = [],
            defined = {},
            urlFetched = {},
            bundlesMap = {},
            requireCounter = 1,
            unnormalizedCounter = 1;

        /**
         * Trims the . and .. from an array of path segments.
         * It will keep a leading path segment if a .. will become
         * the first path segment, to help with module name lookups,
         * which act like paths, but can be remapped. But the end result,
         * all paths that use this function should look normalized.
         * NOTE: this method MODIFIES the input array.
         * @param {Array} ary the array of path segments.
         */
        function trimDots(ary) {
            var i, part;
            for (i = 0; i < ary.length; i++) {
                part = ary[i];
                if (part === '.') {
                    ary.splice(i, 1);
                    i -= 1;
                } else if (part === '..') {
                    // If at the start, or previous value is still ..,
                    // keep them so that when converted to a path it may
                    // still work when converted to a path, even though
                    // as an ID it is less than ideal. In larger point
                    // releases, may be better to just kick out an error.
                    if (i === 0 || (i === 1 && ary[2] === '..') || ary[i - 1] === '..') {
                        continue;
                    } else if (i > 0) {
                        ary.splice(i - 1, 2);
                        i -= 2;
                    }
                }
            }
        }

        /**
         * Given a relative module name, like ./something, normalize it to
         * a real name that can be mapped to a path.
         * @param {String} name the relative name
         * @param {String} baseName a real name that the name arg is relative
         * to.
         * @param {Boolean} applyMap apply the map config to the value. Should
         * only be done if this normalization is for a dependency ID.
         * @returns {String} normalized name
         */
        function normalize(name, baseName, applyMap) {
            var pkgMain, mapValue, nameParts, i, j, nameSegment, lastIndex,
                foundMap, foundI, foundStarMap, starI, normalizedBaseParts,
                baseParts = (baseName && baseName.split('/')),
                map = config.map,
                starMap = map && map['*'];

            //Adjust any relative paths.
            if (name) {
                name = name.split('/');
                lastIndex = name.length - 1;

                // If wanting node ID compatibility, strip .js from end
                // of IDs. Have to do this here, and not in nameToUrl
                // because node allows either .js or non .js to map
                // to same file.
                if (config.nodeIdCompat && jsSuffixRegExp.test(name[lastIndex])) {
                    name[lastIndex] = name[lastIndex].replace(jsSuffixRegExp, '');
                }

                // Starts with a '.' so need the baseName
                if (name[0].charAt(0) === '.' && baseParts) {
                    //Convert baseName to array, and lop off the last part,
                    //so that . matches that 'directory' and not name of the baseName's
                    //module. For instance, baseName of 'one/two/three', maps to
                    //'one/two/three.js', but we want the directory, 'one/two' for
                    //this normalization.
                    normalizedBaseParts = baseParts.slice(0, baseParts.length - 1);
                    name = normalizedBaseParts.concat(name);
                }

                trimDots(name);
                name = name.join('/');
            }

            //Apply map config if available.
            if (applyMap && map && (baseParts || starMap)) {
                nameParts = name.split('/');

                outerLoop: for (i = nameParts.length; i > 0; i -= 1) {
                    nameSegment = nameParts.slice(0, i).join('/');

                    if (baseParts) {
                        //Find the longest baseName segment match in the config.
                        //So, do joins on the biggest to smallest lengths of baseParts.
                        for (j = baseParts.length; j > 0; j -= 1) {
                            mapValue = getOwn(map, baseParts.slice(0, j).join('/'));

                            //baseName segment has config, find if it has one for
                            //this name.
                            if (mapValue) {
                                mapValue = getOwn(mapValue, nameSegment);
                                if (mapValue) {
                                    //Match, update name to the new value.
                                    foundMap = mapValue;
                                    foundI = i;
                                    break outerLoop;
                                }
                            }
                        }
                    }

                    //Check for a star map match, but just hold on to it;
                    //if there is a shorter segment match later in a matching
                    //config, favor that over this star map.
                    if (!foundStarMap && starMap && getOwn(starMap, nameSegment)) {
                        foundStarMap = getOwn(starMap, nameSegment);
                        starI = i;
                    }
                }

                if (!foundMap && foundStarMap) {
                    foundMap = foundStarMap;
                    foundI = starI;
                }

                if (foundMap) {
                    nameParts.splice(0, foundI, foundMap);
                    name = nameParts.join('/');
                }
            }

            // If the name points to a package's name, use
            // the package main instead.
            pkgMain = getOwn(config.pkgs, name);

            return pkgMain ? pkgMain : name;
        }
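
        //For example (illustrative), with no map config applied,
        //normalize('./c', 'a/b/d') resolves './c' against the directory
        //of the base module ('a/b'), yielding 'a/b/c'.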

        function removeScript(name) {
            if (isBrowser) {
                each(scripts(), function (scriptNode) {
                    if (scriptNode.getAttribute('data-requiremodule') === name &&
                            scriptNode.getAttribute('data-requirecontext') === context.contextName) {
                        scriptNode.parentNode.removeChild(scriptNode);
                        return true;
                    }
                });
            }
        }

        function hasPathFallback(id) {
            var pathConfig = getOwn(config.paths, id);
            if (pathConfig && isArray(pathConfig) && pathConfig.length > 1) {
                //Pop off the first array value, since it failed, and
                //retry
                pathConfig.shift();
                context.require.undef(id);

                //Custom require that does not do map translation, since
                //ID is "absolute", already mapped/resolved.
                context.makeRequire(null, {
                    skipMap: true
                })([id]);

                return true;
            }
        }
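
        //For example (illustrative), given a hypothetical paths config of
        //  paths: { jquery: ['//cdn.example.com/jquery', 'lib/jquery'] }
        //a failed load of the CDN URL shifts it off the array and retries
        //the load with 'lib/jquery'.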

        //Turns a plugin!resource to [plugin, resource]
        //with the plugin being undefined if the name
        //did not have a plugin prefix.
        function splitPrefix(name) {
            var prefix,
                index = name ? name.indexOf('!') : -1;
            if (index > -1) {
                prefix = name.substring(0, index);
                name = name.substring(index + 1, name.length);
            }
            return [prefix, name];
        }
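
        //For example (illustrative):
        //  splitPrefix('text!tmpl.html') -> ['text', 'tmpl.html']
        //  splitPrefix('tmpl.html')      -> [undefined, 'tmpl.html']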

        /**
         * Creates a module mapping that includes plugin prefix, module
         * name, and path. If parentModuleMap is provided it will
         * also normalize the name via require.normalize()
         *
         * @param {String} name the module name
         * @param {Object} [parentModuleMap] parent module map
         * for the module name, used to resolve relative names.
         * @param {Boolean} isNormalized true if the ID is already normalized.
         * This is true if this call is done for a define() module ID.
         * @param {Boolean} applyMap apply the map config to the ID.
         * Should only be true if this map is for a dependency.
         *
         * @returns {Object}
         */
        function makeModuleMap(name, parentModuleMap, isNormalized, applyMap) {
            var url, pluginModule, suffix, nameParts,
                prefix = null,
                parentName = parentModuleMap ? parentModuleMap.name : null,
                originalName = name,
                isDefine = true,
                normalizedName = '';

            //If no name, then it means it is a require call, generate an
            //internal name.
            if (!name) {
                isDefine = false;
                name = '_@r' + (requireCounter += 1);
            }

            nameParts = splitPrefix(name);
            prefix = nameParts[0];
            name = nameParts[1];

            if (prefix) {
                prefix = normalize(prefix, parentName, applyMap);
                pluginModule = getOwn(defined, prefix);
            }

            //Account for relative paths if there is a base name.
            if (name) {
                if (prefix) {
                    if (pluginModule && pluginModule.normalize) {
                        //Plugin is loaded, use its normalize method.
                        normalizedName = pluginModule.normalize(name, function (name) {
                            return normalize(name, parentName, applyMap);
                        });
                    } else {
                        // If there are nested plugin references, do not try
                        // to normalize, as it will not normalize correctly.
                        // This places a restriction on resourceIds; the longer
                        // term solution is not to normalize until plugins are
                        // loaded, deferring all normalizations, to allow for
                        // async loading of a loader plugin. But for now, this
                        // fixes the common uses. Details in #1131
                        normalizedName = name.indexOf('!') === -1 ?
                                         normalize(name, parentName, applyMap) :
                                         name;
                    }
                } else {
                    //A regular module.
                    normalizedName = normalize(name, parentName, applyMap);

                    //Normalized name may be a plugin ID due to map config
                    //application in normalize. The map config values must
                    //already be normalized, so do not need to redo that part.
                    nameParts = splitPrefix(normalizedName);
                    prefix = nameParts[0];
                    normalizedName = nameParts[1];
                    isNormalized = true;

                    url = context.nameToUrl(normalizedName);
                }
            }

            //If the ID is a plugin ID and it cannot be determined whether it
            //needs normalization, stamp it with a unique ID so two matching
            //relative IDs that may conflict can be kept separate.
            suffix = prefix && !pluginModule && !isNormalized ?
                     '_unnormalized' + (unnormalizedCounter += 1) :
                     '';

            return {
                prefix: prefix,
                name: normalizedName,
                parentMap: parentModuleMap,
                unnormalized: !!suffix,
                url: url,
                originalName: originalName,
                isDefine: isDefine,
                id: (prefix ?
                        prefix + '!' + normalizedName :
                        normalizedName) + suffix
            };
        }
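
        //For example (illustrative), once the 'text' plugin has loaded,
        //makeModuleMap('text!tmpl.html') yields a map along the lines of
        //  { prefix: 'text', name: 'tmpl.html', id: 'text!tmpl.html', ... }
        //whereas before the plugin loads, the ID gets an '_unnormalized'
        //suffix from the branch above.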

        function getModule(depMap) {
            var id = depMap.id,
                mod = getOwn(registry, id);

            if (!mod) {
                mod = registry[id] = new context.Module(depMap);
            }

            return mod;
        }

        function on(depMap, name, fn) {
            var id = depMap.id,
                mod = getOwn(registry, id);

            if (hasProp(defined, id) &&
                    (!mod || mod.defineEmitComplete)) {
                if (name === 'defined') {
                    fn(defined[id]);
                }
            } else {
                mod = getModule(depMap);
                if (mod.error && name === 'error') {
                    fn(mod.error);
                } else {
                    mod.on(name, fn);
                }
            }
        }

        function onError(err, errback) {
            var ids = err.requireModules,
                notified = false;

            if (errback) {
                errback(err);
            } else {
                each(ids, function (id) {
                    var mod = getOwn(registry, id);
                    if (mod) {
                        //Set error on module, so it skips timeout checks.
                        mod.error = err;
                        if (mod.events.error) {
                            notified = true;
                            mod.emit('error', err);
                        }
                    }
                });

                if (!notified) {
                    req.onError(err);
                }
            }
        }

        /**
         * Internal method to transfer globalQueue items to this context's
         * defQueue.
         */
        function takeGlobalQueue() {
            //Push all the globalDefQueue items into the context's defQueue
            if (globalDefQueue.length) {
                each(globalDefQueue, function(queueItem) {
                    var id = queueItem[0];
                    if (typeof id === 'string') {
                        context.defQueueMap[id] = true;
                    }
                    defQueue.push(queueItem);
                });
                globalDefQueue = [];
            }
        }

        handlers = {
            'require': function (mod) {
                if (mod.require) {
                    return mod.require;
                } else {
                    return (mod.require = context.makeRequire(mod.map));
                }
            },
            'exports': function (mod) {
                mod.usingExports = true;
                if (mod.map.isDefine) {
                    if (mod.exports) {
                        return (defined[mod.map.id] = mod.exports);
                    } else {
                        return (mod.exports = defined[mod.map.id] = {});
                    }
                }
            },
            'module': function (mod) {
                if (mod.module) {
                    return mod.module;
                } else {
                    return (mod.module = {
                        id: mod.map.id,
                        uri: mod.map.url,
                        config: function () {
                            return getOwn(config.config, mod.map.id) || {};
                        },
                        exports: mod.exports || (mod.exports = {})
                    });
                }
            }
        };
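
        //These handlers supply the special 'require', 'exports' and 'module'
        //dependency IDs used by CommonJS-style modules, e.g. (illustrative):
        //  define(function (require, exports, module) {
        //      exports.greet = function () { return 'hi'; };
        //  });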

        function cleanRegistry(id) {
            //Clean up machinery used for waiting modules.
            delete registry[id];
            delete enabledRegistry[id];
        }

        function breakCycle(mod, traced, processed) {
            var id = mod.map.id;

            if (mod.error) {
                mod.emit('error', mod.error);
            } else {
                traced[id] = true;
                each(mod.depMaps, function (depMap, i) {
                    var depId = depMap.id,
                        dep = getOwn(registry, depId);

                    //Only force things that have not completed
                    //being defined, so still in the registry,
                    //and only if it has not been matched up
                    //in the module already.
                    if (dep && !mod.depMatched[i] && !processed[depId]) {
                        if (getOwn(traced, depId)) {
                            mod.defineDep(i, defined[depId]);
                            mod.check(); //pass false?
                        } else {
                            breakCycle(dep, traced, processed);
                        }
                    }
                });
                processed[id] = true;
            }
        }

        function checkLoaded() {
            var err, usingPathFallback,
                waitInterval = config.waitSeconds * 1000,
                //It is possible to disable the wait interval by using waitSeconds of 0.
                expired = waitInterval && (context.startTime + waitInterval) < new Date().getTime(),
                noLoads = [],
                reqCalls = [],
                stillLoading = false,
                needCycleCheck = true;

            //Do not bother if this call was a result of a cycle break.
            if (inCheckLoaded) {
                return;
            }

            inCheckLoaded = true;

            //Figure out the state of all the modules.
            eachProp(enabledRegistry, function (mod) {
                var map = mod.map,
                    modId = map.id;

                //Skip things that are not enabled or in error state.
                if (!mod.enabled) {
                    return;
                }

                if (!map.isDefine) {
                    reqCalls.push(mod);
                }

                if (!mod.error) {
                    //If the module should be executed, and it has not
                    //been inited and time is up, remember it.
                    if (!mod.inited && expired) {
                        if (hasPathFallback(modId)) {
                            usingPathFallback = true;
                            stillLoading = true;
                        } else {
                            noLoads.push(modId);
                            removeScript(modId);
                        }
                    } else if (!mod.inited && mod.fetched && map.isDefine) {
                        stillLoading = true;
                        if (!map.prefix) {
                            //No reason to keep looking for unfinished
                            //loading. If the only stillLoading is a
                            //plugin resource though, keep going,
                            //because it may be that a plugin resource
                            //is waiting on a non-plugin cycle.
                            return (needCycleCheck = false);
                        }
                    }
                }
            });

            if (expired && noLoads.length) {
                //If wait time expired, throw error of unloaded modules.
                err = makeError('timeout', 'Load timeout for modules: ' + noLoads, null, noLoads);
                err.contextName = context.contextName;
                return onError(err);
            }

            //Not expired, check for a cycle.
            if (needCycleCheck) {
                each(reqCalls, function (mod) {
                    breakCycle(mod, {}, {});
                });
            }

            //If still waiting on loads, and the waiting load is something
            //other than a plugin resource, or there are still outstanding
            //scripts, then just try back later.
            if ((!expired || usingPathFallback) && stillLoading) {
                //Something is still waiting to load. Wait for it, but only
                //if a timeout is not already in effect.
                if ((isBrowser || isWebWorker) && !checkLoadedTimeoutId) {
                    checkLoadedTimeoutId = setTimeout(function () {
                        checkLoadedTimeoutId = 0;
                        checkLoaded();
                    }, 50);
                }
            }

            inCheckLoaded = false;
        }

        Module = function (map) {
            this.events = getOwn(undefEvents, map.id) || {};
            this.map = map;
            this.shim = getOwn(config.shim, map.id);
            this.depExports = [];
            this.depMaps = [];
            this.depMatched = [];
            this.pluginMaps = {};
            this.depCount = 0;

            /* this.exports this.factory
               this.depMaps = [],
               this.enabled, this.fetched
            */
        };

        Module.prototype = {
            init: function (depMaps, factory, errback, options) {
                options = options || {};

                //Do not do more inits if already done. Can happen if there
                //are multiple define calls for the same module. That is not
                //a normal, common case, but it is also not unexpected.
                if (this.inited) {
                    return;
                }

                this.factory = factory;

                if (errback) {
                    //Register for errors on this module.
                    this.on('error', errback);
                } else if (this.events.error) {
                    //If no errback already, but there are error listeners
                    //on this module, set up an errback to pass to the deps.
                    errback = bind(this, function (err) {
                        this.emit('error', err);
                    });
                }

                //Do a copy of the dependency array, so that
                //source inputs are not modified. For example
                //"shim" deps are passed in here directly, and
                //doing a direct modification of the depMaps array
                //would affect that config.
                this.depMaps = depMaps && depMaps.slice(0);

                this.errback = errback;

                //Indicate this module has been initialized
                this.inited = true;

                this.ignore = options.ignore;

                //Could have option to init this module in enabled mode,
                //or could have been previously marked as enabled. However,
                //the dependencies are not known until init is called. So
                //if enabled previously, now trigger dependencies as enabled.
                if (options.enabled || this.enabled) {
                    //Enable this module and dependencies.
                    //Will call this.check()
                    this.enable();
                } else {
                    this.check();
                }
            },

            defineDep: function (i, depExports) {
                //Because of cycles, defined callback for a given
                //export can be called more than once.
                if (!this.depMatched[i]) {
                    this.depMatched[i] = true;
                    this.depCount -= 1;
                    this.depExports[i] = depExports;
                }
            },

            fetch: function () {
                if (this.fetched) {
                    return;
                }
                this.fetched = true;

                context.startTime = (new Date()).getTime();

                var map = this.map;

                //If the manager is for a plugin managed resource,
                //ask the plugin to load it now.
                if (this.shim) {
                    context.makeRequire(this.map, {
                        enableBuildCallback: true
                    })(this.shim.deps || [], bind(this, function () {
                        return map.prefix ? this.callPlugin() : this.load();
                    }));
                } else {
                    //Regular dependency.
                    return map.prefix ? this.callPlugin() : this.load();
                }
            },

            load: function () {
                var url = this.map.url;

                //Regular dependency.
                if (!urlFetched[url]) {
                    urlFetched[url] = true;
                    context.load(this.map.id, url);
                }
            },

            /**
             * Checks if the module is ready to define itself, and if so,
             * define it.
             */
            check: function () {
                if (!this.enabled || this.enabling) {
                    return;
                }

                var err, cjsModule,
                    id = this.map.id,
                    depExports = this.depExports,
                    exports = this.exports,
                    factory = this.factory;

                if (!this.inited) {
                    // Only fetch if not already in the defQueue.
                    if (!hasProp(context.defQueueMap, id)) {
                        this.fetch();
                    }
                } else if (this.error) {
                    this.emit('error', this.error);
                } else if (!this.defining) {
                    //The factory could trigger another require call
                    //that would result in checking this module to
                    //define itself again. If already in the process
                    //of doing that, skip this work.
                    this.defining = true;

                    if (this.depCount < 1 && !this.defined) {
                        if (isFunction(factory)) {
                            try {
                                exports = context.execCb(id, factory, depExports, exports);
                            } catch (e) {
                                err = e;
                            }

                            // Favor return value over exports. If node/cjs in play,
                            // then will not have a return value anyway. Favor
                            // module.exports assignment over exports object.
                            if (this.map.isDefine && exports === undefined) {
                                cjsModule = this.module;
                                if (cjsModule) {
                                    exports = cjsModule.exports;
                                } else if (this.usingExports) {
                                    //exports already set the defined value.
                                    exports = this.exports;
                                }
                            }

                            if (err) {
                                // If there is an error listener, favor passing
                                // to that instead of throwing an error. However,
                                // only do it for define()'d modules. require
                                // errbacks should not be called for failures in
                                // their callbacks (#699). However if a global
                                // onError is set, use that.
                                if ((this.events.error && this.map.isDefine) ||
                                    req.onError !== defaultOnError) {
                                    err.requireMap = this.map;
                                    err.requireModules = this.map.isDefine ? [this.map.id] : null;
                                    err.requireType = this.map.isDefine ? 'define' : 'require';
                                    return onError((this.error = err));
                                } else if (typeof console !== 'undefined' &&
                                           console.error) {
                                    // Log the error for debugging. If promises could be
                                    // used, this would be different, but making do.
                                    console.error(err);
                                } else {
                                    // Do not want to completely lose the error. While this
                                    // will mess up processing and lead to similar results
                                    // as bug 1440, it at least surfaces the error.
                                    req.onError(err);
                                }
                            }
                        } else {
                            //Just a literal value
                            exports = factory;
                        }

                        this.exports = exports;

                        if (this.map.isDefine && !this.ignore) {
                            defined[id] = exports;

                            if (req.onResourceLoad) {
                                var resLoadMaps = [];
                                each(this.depMaps, function (depMap) {
                                    resLoadMaps.push(depMap.normalizedMap || depMap);
                                });
                                req.onResourceLoad(context, this.map, resLoadMaps);
                            }
                        }

                        //Clean up
                        cleanRegistry(id);

                        this.defined = true;
                    }

                    //Finished the define stage. Allow calling check again
                    //to allow define notifications below in the case of a
                    //cycle.
                    this.defining = false;

                    if (this.defined && !this.defineEmitted) {
                        this.defineEmitted = true;
                        this.emit('defined', this.exports);
                        this.defineEmitComplete = true;
                    }

                }
            },

            callPlugin: function () {
                var map = this.map,
                    id = map.id,
                    //Map already normalized the prefix.
                    pluginMap = makeModuleMap(map.prefix);

                //Mark this as a dependency for this plugin, so it
                //can be traced for cycles.
                this.depMaps.push(pluginMap);

                on(pluginMap, 'defined', bind(this, function (plugin) {
                    var load, normalizedMap, normalizedMod,
                        bundleId = getOwn(bundlesMap, this.map.id),
                        name = this.map.name,
                        parentName = this.map.parentMap ? this.map.parentMap.name : null,
                        localRequire = context.makeRequire(map.parentMap, {
                            enableBuildCallback: true
                        });

                    //If current map is not normalized, wait for that
                    //normalized name to load instead of continuing.
                    if (this.map.unnormalized) {
                        //Normalize the ID if the plugin allows it.
                        if (plugin.normalize) {
                            name = plugin.normalize(name, function (name) {
                                return normalize(name, parentName, true);
                            }) || '';
                        }

                        //prefix and name should already be normalized, no need
                        //for applying map config again either.
                        normalizedMap = makeModuleMap(map.prefix + '!' + name,
                                                      this.map.parentMap);
                        on(normalizedMap,
                            'defined', bind(this, function (value) {
                                this.map.normalizedMap = normalizedMap;
                                this.init([], function () { return value; }, null, {
                                    enabled: true,
                                    ignore: true
                                });
                            }));

                        normalizedMod = getOwn(registry, normalizedMap.id);
                        if (normalizedMod) {
                            //Mark this as a dependency for this plugin, so it
                            //can be traced for cycles.
                            this.depMaps.push(normalizedMap);

                            if (this.events.error) {
                                normalizedMod.on('error', bind(this, function (err) {
                                    this.emit('error', err);
                                }));
                            }
                            normalizedMod.enable();
                        }

                        return;
                    }

                    //If this resource is part of a bundles config, just load
                    //that bundle file instead to resolve the plugin, as the
                    //resource is built into that bundle.
                    if (bundleId) {
                        this.map.url = context.nameToUrl(bundleId);
                        this.load();
                        return;
                    }

                    load = bind(this, function (value) {
                        this.init([], function () { return value; }, null, {
                            enabled: true
                        });
                    });

                    load.error = bind(this, function (err) {
                        this.inited = true;
                        this.error = err;
                        err.requireModules = [id];

                        //Remove temp unnormalized modules for this module,
                        //since they will never be resolved otherwise now.
                        eachProp(registry, function (mod) {
                            if (mod.map.id.indexOf(id + '_unnormalized') === 0) {
                                cleanRegistry(mod.map.id);
                            }
                        });

                        onError(err);
                    });

                    //Allow plugins to load other code without having to know the
                    //context or how to 'complete' the load.
                    load.fromText = bind(this, function (text, textAlt) {
                        /*jslint evil: true */
                        var moduleName = map.name,
                            moduleMap = makeModuleMap(moduleName),
                            hasInteractive = useInteractive;

                        //As of 2.1.0, support just passing the text, to reinforce
                        //fromText only being called once per resource. Still
                        //support old style of passing moduleName but discard
                        //that moduleName in favor of the internal ref.
                        if (textAlt) {
                            text = textAlt;
                        }

                        //Turn off interactive script matching for IE for any define
                        //calls in the text, then turn it back on at the end.
                        if (hasInteractive) {
                            useInteractive = false;
                        }

                        //Prime the system by creating a module instance for
                        //it.
                        getModule(moduleMap);

                        //Transfer any config to this other module.
                        if (hasProp(config.config, id)) {
                            config.config[moduleName] = config.config[id];
                        }

                        try {
                            req.exec(text);
                        } catch (e) {
                            return onError(makeError('fromtexteval',
                                             'fromText eval for ' + id +
                                            ' failed: ' + e,
                                             e,
                                             [id]));
                        }

                        if (hasInteractive) {
                            useInteractive = true;
                        }

                        //Mark this as a dependency for the plugin
                        //resource
                        this.depMaps.push(moduleMap);

                        //Support anonymous modules.
                        context.completeLoad(moduleName);

                        //Bind the value of that module to the value for this
                        //resource ID.
                        localRequire([moduleName], load);
                    });

                    //Use parentName here since the plugin's name is not reliable;
                    //it could be some weird string with no path that actually wants to
                    //reference the parentName's path.
                    plugin.load(map.name, localRequire, load, config);
                }));

                context.enable(pluginMap, this);
                this.pluginMaps[pluginMap.id] = pluginMap;
            },

            enable: function () {
                enabledRegistry[this.map.id] = this;
                this.enabled = true;

                //Set a flag indicating that the module is enabling,
                //so that immediate calls to the defined callbacks
                //for dependencies do not trigger an inadvertent load
                //while the depCount is still zero.
                this.enabling = true;

                //Enable each dependency
                each(this.depMaps, bind(this, function (depMap, i) {
                    var id, mod, handler;

                    if (typeof depMap === 'string') {
                        //Dependency needs to be converted to a depMap
                        //and wired up to this module.
                        depMap = makeModuleMap(depMap,
                                               (this.map.isDefine ? this.map : this.map.parentMap),
                                               false,
                                               !this.skipMap);
                        this.depMaps[i] = depMap;

                        handler = getOwn(handlers, depMap.id);

                        if (handler) {
                            this.depExports[i] = handler(this);
                            return;
                        }

                        this.depCount += 1;

                        on(depMap, 'defined', bind(this, function (depExports) {
                            if (this.undefed) {
                                return;
                            }
                            this.defineDep(i, depExports);
                            this.check();
                        }));

                        if (this.errback) {
                            on(depMap, 'error', bind(this, this.errback));
                        } else if (this.events.error) {
                            // No direct errback on this module, but something
                            // else is listening for errors, so be sure to
                            // propagate the error correctly.
                            on(depMap, 'error', bind(this, function(err) {
                                this.emit('error', err);
                            }));
                        }
                    }

                    id = depMap.id;
                    mod = registry[id];

                    //Skip special modules like 'require', 'exports', 'module'
                    //Also, don't call enable if it is already enabled,
                    //important in circular dependency cases.
                    if (!hasProp(handlers, id) && mod && !mod.enabled) {
                        context.enable(depMap, this);
                    }
                }));

                //Enable each plugin that is used in
                //a dependency
                eachProp(this.pluginMaps, bind(this, function (pluginMap) {
                    var mod = getOwn(registry, pluginMap.id);
                    if (mod && !mod.enabled) {
                        context.enable(pluginMap, this);
                    }
                }));

                this.enabling = false;

                this.check();
            },

            on: function (name, cb) {
                var cbs = this.events[name];
                if (!cbs) {
                    cbs = this.events[name] = [];
                }
                cbs.push(cb);
            },

            emit: function (name, evt) {
                each(this.events[name], function (cb) {
                    cb(evt);
                });
                if (name === 'error') {
                    //Now that the error handler was triggered, remove
                    //the listeners, since this broken Module instance
                    //can stay around for a while in the registry.
                    delete this.events[name];
                }
            }
        };

        function callGetModule(args) {
            //Skip modules already defined.
            if (!hasProp(defined, args[0])) {
                getModule(makeModuleMap(args[0], null, true)).init(args[1], args[2]);
            }
        }

        function removeListener(node, func, name, ieName) {
            //Favor detachEvent because of IE9
            //issue, see attachEvent/addEventListener comment elsewhere
            //in this file.
            if (node.detachEvent && !isOpera) {
                //Probably IE. If not it will throw an error, which will be
                //useful to know.
                if (ieName) {
                    node.detachEvent(ieName, func);
                }
            } else {
                node.removeEventListener(name, func, false);
            }
        }

        /**
         * Given an event from a script node, gets the requirejs info from it,
         * and then removes the event listeners on the node.
         * @param {Event} evt
         * @returns {Object}
         */
        function getScriptData(evt) {
            //Using currentTarget instead of target for Firefox 2.0's sake. Not
            //all old browsers will be supported, but this one was easy enough
            //to support and still makes sense.
            var node = evt.currentTarget || evt.srcElement;

            //Remove the listeners once here.
            removeListener(node, context.onScriptLoad, 'load', 'onreadystatechange');
            removeListener(node, context.onScriptError, 'error');

            return {
                node: node,
                id: node && node.getAttribute('data-requiremodule')
            };
        }

        function intakeDefines() {
            var args;

            //Any defined modules in the global queue, intake them now.
            takeGlobalQueue();

            //Make sure any remaining defQueue items get properly processed.
            while (defQueue.length) {
                args = defQueue.shift();
                if (args[0] === null) {
                    return onError(makeError('mismatch', 'Mismatched anonymous define() module: ' +
                        args[args.length - 1]));
                } else {
                    //args are id, deps, factory. Should be normalized by the
                    //define() function.
                    callGetModule(args);
                }
            }
            context.defQueueMap = {};
        }

        context = {
            config: config,
            contextName: contextName,
            registry: registry,
            defined: defined,
            urlFetched: urlFetched,
            defQueue: defQueue,
            defQueueMap: {},
            Module: Module,
            makeModuleMap: makeModuleMap,
            nextTick: req.nextTick,
            onError: onError,

            /**
             * Set a configuration for the context.
             * @param {Object} cfg config object to integrate.
             */
            configure: function (cfg) {
                //Make sure the baseUrl ends in a slash.
                if (cfg.baseUrl) {
                    if (cfg.baseUrl.charAt(cfg.baseUrl.length - 1) !== '/') {
                        cfg.baseUrl += '/';
                    }
                }

                //Save off the paths since they require special processing,
                //they are additive.
                var shim = config.shim,
                    objs = {
                        paths: true,
                        bundles: true,
                        config: true,
                        map: true
                    };

                eachProp(cfg, function (value, prop) {
                    if (objs[prop]) {
                        if (!config[prop]) {
                            config[prop] = {};
                        }
                        mixin(config[prop], value, true, true);
                    } else {
                        config[prop] = value;
                    }
                });

                //Reverse map the bundles
                if (cfg.bundles) {
                    eachProp(cfg.bundles, function (value, prop) {
                        each(value, function (v) {
                            if (v !== prop) {
                                bundlesMap[v] = prop;
                            }
                        });
                    });
                }

                //Merge shim
                if (cfg.shim) {
                    eachProp(cfg.shim, function (value, id) {
                        //Normalize the structure
                        if (isArray(value)) {
                            value = {
                                deps: value
                            };
                        }
                        if ((value.exports || value.init) && !value.exportsFn) {
                            value.exportsFn = context.makeShimExports(value);
                        }
                        shim[id] = value;
                    });
                    config.shim = shim;
                }

                //Adjust packages if necessary.
                if (cfg.packages) {
                    each(cfg.packages, function (pkgObj) {
                        var location, name;

                        pkgObj = typeof pkgObj === 'string' ? {name: pkgObj} : pkgObj;

                        name = pkgObj.name;
                        location = pkgObj.location;
                        if (location) {
                            config.paths[name] = pkgObj.location;
                        }

                        //Save pointer to main module ID for pkg name.
                        //Remove leading dot in main, so main paths are normalized,
                        //and remove any trailing .js, since different package
                        //envs have different conventions: some use a module name,
                        //some use a file name.
                        config.pkgs[name] = pkgObj.name + '/' + (pkgObj.main || 'main')
                                     .replace(currDirRegExp, '')
                                     .replace(jsSuffixRegExp, '');
                    });
                }

                //If there are any "waiting to execute" modules in the registry,
                //update the maps for them, since their info, like URLs to load,
                //may have changed.
                eachProp(registry, function (mod, id) {
                    //If a module already has init called, it is too
                    //late to modify it; also ignore unnormalized ones,
                    //since they are transient.
                    if (!mod.inited && !mod.map.unnormalized) {
                        mod.map = makeModuleMap(id, null, true);
                    }
                });

                //If a deps array or a config callback is specified, then call
                //require with those args. This is useful when require is defined as a
                //config object before require.js is loaded.
                if (cfg.deps || cfg.callback) {
                    context.require(cfg.deps || [], cfg.callback);
                }
            },

            makeShimExports: function (value) {
                function fn() {
                    var ret;
                    if (value.init) {
                        ret = value.init.apply(global, arguments);
                    }
                    return ret || (value.exports && getGlobal(value.exports));
                }
                return fn;
            },

            makeRequire: function (relMap, options) {
                options = options || {};

                function localRequire(deps, callback, errback) {
                    var id, map, requireMod;

                    if (options.enableBuildCallback && callback && isFunction(callback)) {
                        callback.__requireJsBuild = true;
                    }

                    if (typeof deps === 'string') {
                        if (isFunction(callback)) {
                            //Invalid call
                            return onError(makeError('requireargs', 'Invalid require call'), errback);
                        }

                        //If require|exports|module are requested, get the
                        //value for them from the special handlers. Caveat:
                        //this only works while module is being defined.
                        if (relMap && hasProp(handlers, deps)) {
                            return handlers[deps](registry[relMap.id]);
                        }

                        //Synchronous access to one module. If require.get is
                        //available (as in the Node adapter), prefer that.
                        if (req.get) {
                            return req.get(context, deps, relMap, localRequire);
                        }

                        //Normalize module name, if it contains . or ..
                        map = makeModuleMap(deps, relMap, false, true);
                        id = map.id;

                        if (!hasProp(defined, id)) {
                            return onError(makeError('notloaded', 'Module name "' +
                                        id +
                                        '" has not been loaded yet for context: ' +
                                        contextName +
                                        (relMap ? '' : '. Use require([])')));
                        }
                        return defined[id];
                    }

                    //Grab defines waiting in the global queue.
                    intakeDefines();

                    //Mark all the dependencies as needing to be loaded.
                    context.nextTick(function () {
                        //Some defines could have been added since the
                        //require call, collect them.
                        intakeDefines();

                        requireMod = getModule(makeModuleMap(null, relMap));

                        //Store if map config should be applied to this require
                        //call for dependencies.
                        requireMod.skipMap = options.skipMap;

                        requireMod.init(deps, callback, errback, {
                            enabled: true
                        });

                        checkLoaded();
                    });

                    return localRequire;
                }

                mixin(localRequire, {
                    isBrowser: isBrowser,

                    /**
                     * Converts a module name + .extension into a URL path.
                     * *Requires* the use of a module name. It does not support using
                     * plain URLs like nameToUrl.
                     */
                    toUrl: function (moduleNamePlusExt) {
                        var ext,
                            index = moduleNamePlusExt.lastIndexOf('.'),
                            segment = moduleNamePlusExt.split('/')[0],
                            isRelative = segment === '.' || segment === '..';

                        //There is a file extension alias, and it is not the
                        //dots from a relative path.
                        if (index !== -1 && (!isRelative || index > 1)) {
                            ext = moduleNamePlusExt.substring(index, moduleNamePlusExt.length);
                            moduleNamePlusExt = moduleNamePlusExt.substring(0, index);
                        }

                        return context.nameToUrl(normalize(moduleNamePlusExt,
                                                relMap && relMap.id, true), ext,  true);
                    },

                    defined: function (id) {
                        return hasProp(defined, makeModuleMap(id, relMap, false, true).id);
                    },

                    specified: function (id) {
                        id = makeModuleMap(id, relMap, false, true).id;
                        return hasProp(defined, id) || hasProp(registry, id);
                    }
                });

                //Only allow undef on top level require calls
                if (!relMap) {
                    localRequire.undef = function (id) {
                        //Bind any waiting define() calls to this context,
                        //fix for #408
                        takeGlobalQueue();

                        var map = makeModuleMap(id, relMap, true),
                            mod = getOwn(registry, id);

                        mod.undefed = true;
                        removeScript(id);

                        delete defined[id];
                        delete urlFetched[map.url];
                        delete undefEvents[id];

                        //Clean queued defines too. Go backwards
                        //in array so that the splices do not
                        //mess up the iteration.
                        eachReverse(defQueue, function(args, i) {
                            if (args[0] === id) {
                                defQueue.splice(i, 1);
                            }
                        });
                        delete context.defQueueMap[id];

                        if (mod) {
                            //Hold on to listeners in case the
                            //module will be attempted to be reloaded
                            //using a different config.
                            if (mod.events.defined) {
                                undefEvents[id] = mod.events;
                            }

                            cleanRegistry(id);
                        }
                    };
                }

                return localRequire;
            },

            /**
             * Called to enable a module if it is still in the registry
             * awaiting enablement. A second arg, parent, the parent module,
             * is passed in for context, when this method is overridden by
             * the optimizer. Not shown here to keep code compact.
             */
            enable: function (depMap) {
                var mod = getOwn(registry, depMap.id);
                if (mod) {
                    getModule(depMap).enable();
                }
            },

            /**
             * Internal method used by environment adapters to complete a load event.
             * A load event could be a script load or just a load pass from a synchronous
             * load call.
             * @param {String} moduleName the name of the module to potentially complete.
             */
            completeLoad: function (moduleName) {
                var found, args, mod,
                    shim = getOwn(config.shim, moduleName) || {},
                    shExports = shim.exports;

                takeGlobalQueue();

                while (defQueue.length) {
                    args = defQueue.shift();
                    if (args[0] === null) {
                        args[0] = moduleName;
                        //If already found an anonymous module and bound it
                        //to this name, then this is some other anon module
                        //waiting for its completeLoad to fire.
                        if (found) {
                            break;
                        }
                        found = true;
                    } else if (args[0] === moduleName) {
                        //Found matching define call for this script!
                        found = true;
                    }

                    callGetModule(args);
                }
                context.defQueueMap = {};

                //Do this after the cycle of callGetModule in case the result
                //of those calls/init calls changes the registry.
                mod = getOwn(registry, moduleName);

                if (!found && !hasProp(defined, moduleName) && mod && !mod.inited) {
                    if (config.enforceDefine && (!shExports || !getGlobal(shExports))) {
                        if (hasPathFallback(moduleName)) {
                            return;
                        } else {
                            return onError(makeError('nodefine',
                                             'No define call for ' + moduleName,
                                             null,
                                             [moduleName]));
                        }
                    } else {
                        //A script that does not call define(), so just simulate
                        //the call for it.
                        callGetModule([moduleName, (shim.deps || []), shim.exportsFn]);
                    }
                }

                checkLoaded();
            },

            /**
             * Converts a module name to a file path. Supports cases where
             * moduleName may actually be just a URL.
             * Note that it **does not** call normalize on the moduleName,
             * it is assumed to have already been normalized. This is an
             * internal API, not a public one. Use toUrl for the public API.
             */
            nameToUrl: function (moduleName, ext, skipExt) {
                var paths, syms, i, parentModule, url,
                    parentPath, bundleId,
                    pkgMain = getOwn(config.pkgs, moduleName);

                if (pkgMain) {
                    moduleName = pkgMain;
                }

                bundleId = getOwn(bundlesMap, moduleName);

                if (bundleId) {
                    return context.nameToUrl(bundleId, ext, skipExt);
                }

                //If a colon is in the URL, it indicates a protocol is used and it is just
                //a URL to a file, or if it starts with a slash, contains a query arg (i.e. ?)
                //or ends with .js, then assume the user meant to use a URL and not a module ID.
                //The slash is important for protocol-less URLs as well as full paths.
                if (req.jsExtRegExp.test(moduleName)) {
                    //Just a plain path, not module name lookup, so just return it.
                    //Add extension if it is included. This is a bit wonky, only non-.js things pass
                    //an extension, this method probably needs to be reworked.
                    url = moduleName + (ext || '');
                } else {
                    //A module that needs to be converted to a path.
                    paths = config.paths;

                    syms = moduleName.split('/');
                    //For each module name segment, see if there is a path
                    //registered for it. Start with most specific name
                    //and work up from it.
                    for (i = syms.length; i > 0; i -= 1) {
                        parentModule = syms.slice(0, i).join('/');

                        parentPath = getOwn(paths, parentModule);
                        if (parentPath) {
                            //If an array, it means there are a few choices;
                            //choose the one that is desired.
                            if (isArray(parentPath)) {
                                parentPath = parentPath[0];
                            }
                            syms.splice(0, i, parentPath);
                            break;
                        }
                    }

                    //Join the path parts together, then figure out if baseUrl is needed.
                    url = syms.join('/');
                    url += (ext || (/^data\:|\?/.test(url) || skipExt ? '' : '.js'));
                    url = (url.charAt(0) === '/' || url.match(/^[\w\+\.\-]+:/) ? '' : config.baseUrl) + url;
                }

                return config.urlArgs ? url +
                                        ((url.indexOf('?') === -1 ? '?' : '&') +
                                         config.urlArgs) : url;
            },

            //Delegates to req.load. Broken out as a separate function to
            //allow overriding in the optimizer.
            load: function (id, url) {
                req.load(context, id, url);
            },

            /**
             * Executes a module callback function. Broken out as a separate function
             * solely to allow the build system to sequence the files in the built
             * layer in the right sequence.
             *
             * @private
             */
            execCb: function (name, callback, args, exports) {
                return callback.apply(exports, args);
            },

            /**
             * callback for script loads, used to check status of loading.
             *
             * @param {Event} evt the event from the browser for the script
             * that was loaded.
             */
            onScriptLoad: function (evt) {
                //Using currentTarget instead of target for Firefox 2.0's sake. Not
                //all old browsers will be supported, but this one was easy enough
                //to support and still makes sense.
                if (evt.type === 'load' ||
                        (readyRegExp.test((evt.currentTarget || evt.srcElement).readyState))) {
                    //Reset interactive script so a script node is not held onto
                    //for too long.
                    interactiveScript = null;

                    //Pull out the name of the module and the context.
                    var data = getScriptData(evt);
                    context.completeLoad(data.id);
                }
            },

            /**
             * Callback for script errors.
             */
            onScriptError: function (evt) {
                var data = getScriptData(evt);
                if (!hasPathFallback(data.id)) {
                    var parents = [];
                    eachProp(registry, function(value, key) {
                        if (key.indexOf('_@r') !== 0) {
                            each(value.depMaps, function(depMap) {
                                if (depMap.id === data.id) {
                                    parents.push(key);
                                }
                                return true;
                            });
                        }
                    });
                    return onError(makeError('scripterror', 'Script error for "' + data.id +
                                             (parents.length ?
                                             '", needed by: ' + parents.join(', ') :
                                             '"'), evt, [data.id]));
                }
            }
        };

        context.require = context.makeRequire();
        return context;
    }

    /**
     * Main entry point.
     *
     * If the only argument to require is a string, then the module that
     * is represented by that string is fetched for the appropriate context.
     *
     * If the first argument is an array, then it will be treated as an array
     * of dependency string names to fetch. An optional function callback can
     * be specified to execute when all of those dependencies are available.
     *
     * Make a local req variable to help Caja compliance (it assumes things
     * on a require that are not standardized), and to give a short
     * name for minification/local scope use.
     */
    req = requirejs = function (deps, callback, errback, optional) {

        //Find the right context, use default
        var context, config,
            contextName = defContextName;

        // Determine if we have a config object in the call.
        if (!isArray(deps) && typeof deps !== 'string') {
            // deps is a config object
            config = deps;
            if (isArray(callback)) {
                // Adjust args if there are dependencies
                deps = callback;
                callback = errback;
                errback = optional;
            } else {
                deps = [];
            }
        }

        if (config && config.context) {
            contextName = config.context;
        }

        context = getOwn(contexts, contextName);
        if (!context) {
            context = contexts[contextName] = req.s.newContext(contextName);
        }

        if (config) {
            context.configure(config);
        }

        return context.require(deps, callback, errback);
    };
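
    //A minimal sketch of the two call forms described above
    //(the module IDs are hypothetical):
    //
    //  requirejs('some/module');                  //string form: synchronous access to an
    //                                             //already-loaded module in the default context
    //  requirejs(['dep/a', 'dep/b'], function (a, b) {
    //      //array form: fetches the dependencies, then runs this callback
    //  });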

    /**
     * Support require.config() to make it easier to cooperate with other
     * AMD loaders on globally agreed names.
     */
    req.config = function (config) {
        return req(config);
    };
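
    //Illustrative require.config usage (the baseUrl and paths
    //values are hypothetical):
    //
    //  require.config({
    //      baseUrl: 'js/lib',
    //      paths: {
    //          app: '../app'
    //      }
    //  });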

    /**
     * Execute something after the current tick
     * of the event loop. Override for other envs
     * that have a better solution than setTimeout.
     * @param  {Function} fn function to execute later.
     */
    req.nextTick = typeof setTimeout !== 'undefined' ? function (fn) {
        setTimeout(fn, 4);
    } : function (fn) { fn(); };

    /**
     * Export require as a global, but only if it does not already exist.
     */
    if (!require) {
        require = req;
    }

    req.version = version;

    //Used to filter out dependencies that are already paths.
    req.jsExtRegExp = /^\/|:|\?|\.js$/;
    req.isBrowser = isBrowser;
    s = req.s = {
        contexts: contexts,
        newContext: newContext
    };

    //Create default context.
    req({});

    //Exports some context-sensitive methods on global require.
    each([
        'toUrl',
        'undef',
        'defined',
        'specified'
    ], function (prop) {
        //Reference from contexts instead of early binding to default context,
        //so that during builds, the latest instance of the default context
        //with its config gets used.
        req[prop] = function () {
            var ctx = contexts[defContextName];
            return ctx.require[prop].apply(ctx, arguments);
        };
    });

    if (isBrowser) {
        head = s.head = document.getElementsByTagName('head')[0];
        //If BASE tag is in play, using appendChild is a problem for IE6.
        //When that browser dies, this can be removed. Details in this jQuery bug:
        //http://dev.jquery.com/ticket/2709
        baseElement = document.getElementsByTagName('base')[0];
        if (baseElement) {
            head = s.head = baseElement.parentNode;
        }
    }

    /**
     * Any errors that require explicitly generates will be passed to this
     * function. Intercept/override it if you want custom error handling.
     * @param {Error} err the error object.
     */
    req.onError = defaultOnError;

    /**
     * Creates the node for the load command. Only used in browser envs.
     */
    req.createNode = function (config, moduleName, url) {
        var node = config.xhtml ?
                document.createElementNS('http://www.w3.org/1999/xhtml', 'html:script') :
                document.createElement('script');
        node.type = config.scriptType || 'text/javascript';
        node.charset = 'utf-8';
        node.async = true;
        return node;
    };

    /**
     * Does the request to load a module for the browser case.
     * Make this a separate function to allow other environments
     * to override it.
     *
     * @param {Object} context the require context to find state.
     * @param {String} moduleName the name of the module.
     * @param {Object} url the URL to the module.
     */
    req.load = function (context, moduleName, url) {
        var config = (context && context.config) || {},
            node;
        if (isBrowser) {
            //In the browser so use a script tag
            node = req.createNode(config, moduleName, url);
            if (config.onNodeCreated) {
                config.onNodeCreated(node, config, moduleName, url);
            }

            node.setAttribute('data-requirecontext', context.contextName);
            node.setAttribute('data-requiremodule', moduleName);

            //Set up load listener. Test attachEvent first because IE9 has
            //a subtle issue in its addEventListener and script onload firings
            //that do not match the behavior of all other browsers with
            //addEventListener support, which fire the onload event for a
            //script right after the script execution. See:
            //https://connect.microsoft.com/IE/feedback/details/648057/script-onload-event-is-not-fired-immediately-after-script-execution
            //UNFORTUNATELY Opera implements attachEvent but does not follow the
            //script execution mode.
            if (node.attachEvent &&
                    //Check if node.attachEvent is artificially added by custom script or
                    //natively supported by browser
                    //read https://github.com/jrburke/requirejs/issues/187
                    //if we can NOT find [native code] then it must NOT be natively supported.
                    //in IE8, node.attachEvent does not have toString()
                    //Note the test for "[native code" with no closing brace, see:
                    //https://github.com/jrburke/requirejs/issues/273
                    !(node.attachEvent.toString && node.attachEvent.toString().indexOf('[native code') < 0) &&
                    !isOpera) {
                //Probably IE. IE (at least 6-8) do not fire
                //script onload right after executing the script, so
                //we cannot tie the anonymous define call to a name.
                //However, IE reports the script as being in 'interactive'
                //readyState at the time of the define call.
                useInteractive = true;

                node.attachEvent('onreadystatechange', context.onScriptLoad);
                //It would be great to add an error handler here to catch
                //404s in IE9+. However, onreadystatechange will fire before
                //the error handler, so that does not help. If addEventListener
                //is used, then IE will fire error before load, but we cannot
                //use that pathway given the connect.microsoft.com issue
                //mentioned above about not doing the 'script execute,
                //then fire the script load event listener before execute
                //next script' that other browsers do.
                //Best hope: IE10 fixes the issues,
                //and then destroys all installs of IE 6-9.
                //node.attachEvent('onerror', context.onScriptError);
            } else {
                node.addEventListener('load', context.onScriptLoad, false);
                node.addEventListener('error', context.onScriptError, false);
            }
            node.src = url;

            //For some cache cases in IE 6-8, the script executes before the end
            //of the appendChild execution, so to tie an anonymous define
            //call to the module name (which is stored on the node), hold on
            //to a reference to this node, but clear after the DOM insertion.
            currentlyAddingScript = node;
            if (baseElement) {
                head.insertBefore(node, baseElement);
            } else {
                head.appendChild(node);
            }
            currentlyAddingScript = null;

            return node;
        } else if (isWebWorker) {
            try {
                //In a web worker, use importScripts. This is not a very
                //efficient use of importScripts, importScripts will block until
                //its script is downloaded and evaluated. However, if web workers
                //are in play, the expectation is that a build has been done so
                //that only one script needs to be loaded anyway. This may need
                //to be reevaluated if other use cases become common.
                importScripts(url);

                //Account for anonymous modules
                context.completeLoad(moduleName);
            } catch (e) {
                context.onError(makeError('importscripts',
                                'importScripts failed for ' +
                                    moduleName + ' at ' + url,
                                e,
                                [moduleName]));
            }
        }
    };

    function getInteractiveScript() {
        if (interactiveScript && interactiveScript.readyState === 'interactive') {
            return interactiveScript;
        }

        eachReverse(scripts(), function (script) {
            if (script.readyState === 'interactive') {
                return (interactiveScript = script);
            }
        });
        return interactiveScript;
    }

    //Look for a data-main script attribute, which could also adjust the baseUrl.
    if (isBrowser && !cfg.skipDataMain) {
        //Figure out baseUrl. Get it from the script tag with require.js in it.
        eachReverse(scripts(), function (script) {
            //Set the 'head' where we can append children by
            //using the script's parent.
            if (!head) {
                head = script.parentNode;
            }

            //Look for a data-main attribute to set main script for the page
            //to load. If it is there, the path to data main becomes the
            //baseUrl, if it is not already set.
            dataMain = script.getAttribute('data-main');
            if (dataMain) {
                //Preserve dataMain in case it is a path (i.e. contains '?')
                mainScript = dataMain;

                //Set final baseUrl if there is not already an explicit one.
                if (!cfg.baseUrl) {
                    //Pull off the directory of data-main for use as the
                    //baseUrl.
                    src = mainScript.split('/');
                    mainScript = src.pop();
                    subPath = src.length ? src.join('/')  + '/' : './';

                    cfg.baseUrl = subPath;
                }

                //Strip off any trailing .js since mainScript is now
                //like a module name.
                mainScript = mainScript.replace(jsSuffixRegExp, '');

                //If mainScript is still a path, fall back to dataMain
                if (req.jsExtRegExp.test(mainScript)) {
                    mainScript = dataMain;
                }

                //Put the data-main script in the files to load.
                cfg.deps = cfg.deps ? cfg.deps.concat(mainScript) : [mainScript];

                return true;
            }
        });
    }

    /**
     * The function that handles definitions of modules. Differs from
     * require() in that a string for the module should be the first argument,
     * and the function to execute after dependencies are loaded should
     * return a value to define the module corresponding to the first argument's
     * name.
     */
    define = function (name, deps, callback) {
        var node, context;

        //Allow for anonymous modules
        if (typeof name !== 'string') {
            //Adjust args appropriately
            callback = deps;
            deps = name;
            name = null;
        }

        //This module may not have dependencies
        if (!isArray(deps)) {
            callback = deps;
            deps = null;
        }

        //If no name, and callback is a function, then figure out if it is a
        //CommonJS thing with dependencies.
        if (!deps && isFunction(callback)) {
            deps = [];
            //Remove comments from the callback string,
            //look for require calls, and pull them into the dependencies,
            //but only if there are function args.
            if (callback.length) {
                callback
                    .toString()
                    .replace(commentRegExp, '')
                    .replace(cjsRequireRegExp, function (match, dep) {
                        deps.push(dep);
                    });

                //May be a CommonJS thing even without require calls, but still
                //could use exports, and module. Avoid doing exports and module
                //work though if it just needs require.
                //REQUIRES the function to expect the CommonJS variables in the
                //order listed below.
                deps = (callback.length === 1 ? ['require'] : ['require', 'exports', 'module']).concat(deps);
            }
        }

        //If in IE 6-8 and hit an anonymous define() call, do the interactive
        //work.
        if (useInteractive) {
            node = currentlyAddingScript || getInteractiveScript();
            if (node) {
                if (!name) {
                    name = node.getAttribute('data-requiremodule');
                }
                context = contexts[node.getAttribute('data-requirecontext')];
            }
        }

        //Always save off evaluating the def call until the script onload handler.
        //This allows multiple modules to be in a file without prematurely
        //tracing dependencies, and allows for anonymous module support,
        //where the module name is not known until the script onload event
        //occurs. If no context, use the global queue, and get it processed
        //in the onscript load callback.
        if (context) {
            context.defQueue.push([name, deps, callback]);
            context.defQueueMap[name] = true;
        } else {
            globalDefQueue.push([name, deps, callback]);
        }
    };

    define.amd = {
        jQuery: true
    };

    /**
     * Executes the text. Normally just uses eval, but can be modified
     * to use a better, environment-specific call. Only used for transpiling
     * loader plugins, not for plain JS modules.
     * @param {String} text the text to execute/evaluate.
     */
    req.exec = function (text) {
        /*jslint evil: true */
        return eval(text);
    };

    //Set up with config info.
    req(cfg);
}(this));
", "ok": true, "headers": [ [ "content-type", "application/javascript" ] ], "status": 200, "status_text": "" } }, "base_uri": "https://localhost:8080/", "height": 183, "referenced_widgets": [ "4d1bd7a205b94210ba8e1fd946d75821", "1f2847673c374813ac442322e978eec7", "f48103aee6dc4c06b176bc115be332e0", "c6f8b3bf7fce4c928a0db3651813347d", "8d049af8ea834bf7b3a0fc7013fb3ff9", "1dc9d68906594c1eb6db7c9f31876d2a", "f9d8ea0a95924b0596ab4b7f091a94b7", "ba7f35525a7b4cb1951ed1a1e6a57ffd", "d3ee7a14538244b1b64abbeb24948102", "34507ce588b04412aaacea76987f27ea", "c32d39b32a144c4480cd8d6b1d6c199e", "693433c2ec204437ac7878a8bee61647", "19a09d359acf496bb0bc68c63dea1e78", "8e81da5616354ddb887419d81251096b", "2a084a03747d4c9984cc13136ccc4217", "679c7f033d2940f489382161964c5c0d", "2861d6bdfed84911ab25f8175d718e2e", "df575c6406c0426ca9f222b818896439", "5f214f2f5a964fdc97eda797248f720f", "40f8b46b4b8f44dcabd98d6a5e3044b3", "5a68a09e677d4118bc7e3efc18c35999", "73ff50d8f1dc4f4e99062845a024a20b", "b8aa22a0efcb4f43a752e21bdeb73589", "f2095ed84f644757888f5746c2a10ee4" ] }, "outputId": "4aafdc98-d7c7-4930-cdf0-47b9438e2835" }, "source": [ "#@title Step 4: Processing and Displaying Attention Heads\n", "model_version = 'bert-base-uncased'\n", "do_lower_case = True\n", "model = BertModel.from_pretrained(model_version, output_attentions=True)\n", "tokenizer = BertTokenizer.from_pretrained(model_version, do_lower_case=do_lower_case)\n", "\n", "\n", "sentence_a = \"The cat sleeps on the mat\"\n", "sentence_b = \"Le chat dort sur le tapis\"\n", "inputs = tokenizer.encode_plus(sentence_a, sentence_b, return_tensors='pt', add_special_tokens=True)\n", "\n", "\n", "\n", "token_type_ids = inputs['token_type_ids']\n", "input_ids = inputs['input_ids']\n", "attention = model(input_ids, token_type_ids=token_type_ids)[-1]\n", "input_id_list = input_ids[0].tolist() # Batch index 0\n", "tokens = tokenizer.convert_ids_to_tokens(input_id_list)\n", "call_html()\n", "\n", "head_view(attention, tokens)" ], 
"execution_count": null, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "4d1bd7a205b94210ba8e1fd946d75821", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=433.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "d3ee7a14538244b1b64abbeb24948102", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=440473133.0, style=ProgressStyle(descri…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "2861d6bdfed84911ab25f8175d718e2e", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=231508.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "text/html": [ "\n", " \n", " \n", " " ], "text/plain": [ "" ] }, "metadata": { "tags": [] } }, { "output_type": "display_data", "data": { "text/html": [ "\n", " \n", " Layer: \n", " \n", "

\n", " " ], "text/plain": [ "" ] }, "metadata": { "tags": [] } }, { "output_type": "display_data", "data": { "application/javascript": [ "window.params = {\"attention\": {\"all\": {\"attn\": [[[[0.042640648782253265, 0.09681650996208191, 0.03236351162195206, 0.01571996696293354, 0.08111880719661713, 0.10342955589294434, 0.0738406777381897, 0.20756109058856964, 0.01790483668446541, 0.027967726811766624, 0.030372662469744682, 0.030997319146990776, 0.034154053777456284, 0.017717381939291954, 0.030224351212382317, 0.01922592520713806, 0.13794498145580292], [0.11926430463790894, 0.12762202322483063, 0.09728197753429413, 0.08620084077119827, 0.15430551767349243, 0.15357902646064758, 0.1171526163816452, 0.11728273332118988, 0.002462986623868346, 0.0025408673100173473, 0.0038905502296984196, 0.002784406766295433, 0.0019145426340401173, 0.002655654214322567, 0.0045268540270626545, 0.0026093535125255585, 0.003925872500985861], [0.09301801770925522, 0.08041764795780182, 0.10636380314826965, 0.1770949810743332, 0.06251100450754166, 0.08526547998189926, 0.189857617020607, 0.15887802839279175, 0.0030797156505286694, 0.006695868447422981, 0.003967460244894028, 0.004050382412970066, 0.005367732606828213, 0.003237680299207568, 0.009736202657222748, 0.006193476263433695, 0.00426498195156455], [0.106561079621315, 0.05641898140311241, 0.19660750031471252, 0.11499432474374771, 0.18368546664714813, 0.05770811811089516, 0.15900199115276337, 0.08477166295051575, 0.0027165724895894527, 0.006085303612053394, 0.003760731313377619, 0.004821436014026403, 0.004529156256467104, 0.0029052270110696554, 0.00955934263765812, 0.0030809317249804735, 0.0027921488508582115], [0.08910197764635086, 0.07670493423938751, 0.11446008086204529, 0.10101595520973206, 0.28781038522720337, 0.07956993579864502, 0.10312031954526901, 0.12079973518848419, 0.0032753932755440474, 0.002210873644798994, 0.0031903095077723265, 0.0027171436231583357, 0.0029220920987427235, 0.003313075052574277, 0.0037452015094459057, 
0.002506786026060581, 0.0035357533488422632], [0.0998498871922493, 0.11938027292490005, 0.1014232411980629, 0.09744156897068024, 0.20809046924114227, 0.1293657422065735, 0.11939773708581924, 0.10011722892522812, 0.0026749621611088514, 0.0020403428934514523, 0.0035907896235585213, 0.0028014585841447115, 0.001921422895975411, 0.0027241555508226156, 0.003754986450076103, 0.0023009770084172487, 0.003124766983091831], [0.0645175352692604, 0.025949278846383095, 0.3036326766014099, 0.2503582239151001, 0.050063785165548325, 0.025644758716225624, 0.09468463808298111, 0.13614340126514435, 0.007350584492087364, 0.004970385227352381, 0.002643715823069215, 0.00545605830848217, 0.0036450151819735765, 0.0067082783207297325, 0.0094120679423213, 0.00584537535905838, 0.0029742582701146603], [0.11769143491983414, 0.1344335973262787, 0.06413094699382782, 0.0317479632794857, 0.1513580083847046, 0.14977924525737762, 0.1319790929555893, 0.1862022876739502, 0.002302740700542927, 0.0028763131704181433, 0.0028735576197504997, 0.003654520958662033, 0.002978122793138027, 0.002326279180124402, 0.004565268289297819, 0.00349133531562984, 0.00760926678776741], [0.02562863752245903, 0.0004078300844412297, 0.0032536208163946867, 0.003330453997477889, 0.00017095707880798727, 0.00039548988570459187, 0.00199083611369133, 0.0016490630805492401, 0.167580708861351, 0.09408379346132278, 0.06544903665781021, 0.04623166471719742, 0.15448689460754395, 0.14431609213352203, 0.11098875850439072, 0.12445829063653946, 0.055577896535396576], [0.07014763355255127, 0.0005437441286630929, 0.003980056382715702, 0.0026207391638308764, 0.0005810425500385463, 0.000526122807059437, 0.0023223876487463713, 0.00579214608296752, 0.15124566853046417, 0.08132249861955643, 0.050969529896974564, 0.05689125508069992, 0.09932894259691238, 0.1200522631406784, 0.10018663853406906, 0.0984487235546112, 0.15504062175750732], [0.12969426810741425, 0.0007159645901992917, 0.002899829763919115, 0.0034432087559252977, 0.001374734565615654, 
0.0007553455070592463, 0.0019275349332019687, 0.0035819439217448235, 0.11694613099098206, 0.11251003295183182, 0.061726413667201996, 0.061719123274087906, 0.14434251189231873, 0.0918242484331131, 0.05241181328892708, 0.10344237089157104, 0.11068445444107056], [0.08860807865858078, 0.00033340323716402054, 0.005995309446007013, 0.009146761149168015, 0.00066465261625126, 0.0003436091938056052, 0.0025671101175248623, 0.003198149148374796, 0.09682898223400116, 0.20771238207817078, 0.07781518250703812, 0.045778073370456696, 0.10230869054794312, 0.08032435178756714, 0.11268480122089386, 0.08862999826669693, 0.07706047594547272], [0.01774413511157036, 0.0004971520393155515, 0.005784652661532164, 0.005453730933368206, 0.0004703355662059039, 0.00048706456436775625, 0.0016657375963404775, 0.002886575646698475, 0.27312925457954407, 0.09089815616607666, 0.044461674988269806, 0.0464358851313591, 0.049647387117147446, 0.2217429131269455, 0.04777606204152107, 0.11602862924337387, 0.0748906061053276], [0.02785409986972809, 0.00048729090485721827, 0.003965858370065689, 0.00443654228001833, 0.00019893207354471087, 0.00046309587196446955, 0.0025553046725690365, 0.0016836462309584022, 0.1612589806318283, 0.09625481069087982, 0.06796625256538391, 0.04566244035959244, 0.1552688181400299, 0.13440664112567902, 0.12328854948282242, 0.12604457139968872, 0.048204127699136734], [0.07027672976255417, 0.00272547360509634, 0.006296050269156694, 0.012220696546137333, 0.002014654455706477, 0.0024726318661123514, 0.007371077314019203, 0.011058925651013851, 0.10080023854970932, 0.11292599141597748, 0.04508043825626373, 0.06322207301855087, 0.1674046814441681, 0.08792895823717117, 0.02495424449443817, 0.17958371341228485, 0.10366351157426834], [0.041481614112854004, 0.0005970805650576949, 0.004914623219519854, 0.006723812781274319, 0.00038710434455424547, 0.0006574143772013485, 0.0038090385496616364, 0.0036365508567541838, 0.11489778012037277, 0.1283486783504486, 0.08074445277452469, 
0.06326085329055786, 0.11468322575092316, 0.10017228871583939, 0.13928304612636566, 0.09604785591363907, 0.10035452991724014], [0.06458704173564911, 0.0027690366841852665, 0.0016191770555451512, 0.000801018497440964, 0.0020696024876087904, 0.002871264237910509, 0.00262609519995749, 0.006281935144215822, 0.06894499808549881, 0.06760973483324051, 0.0733465924859047, 0.08226058632135391, 0.1185879334807396, 0.0593111515045166, 0.057768288999795914, 0.10335753858089447, 0.28518804907798767]], [[0.6899476051330566, 0.01711839810013771, 0.02227495238184929, 0.005241985898464918, 0.12664860486984253, 0.017489157617092133, 0.0050677284598350525, 0.00617841025814414, 0.0045233434066176414, 0.013983801938593388, 0.012540050782263279, 0.027683710679411888, 0.025330452248454094, 0.004315356723964214, 0.005250631365925074, 0.008778310380876064, 0.007627550046890974], [0.08791413903236389, 0.11619292199611664, 0.061934828758239746, 0.13592441380023956, 0.15491411089897156, 0.1203397586941719, 0.10690971463918686, 0.0958336591720581, 0.01591033674776554, 0.013104955665767193, 0.017140839248895645, 0.010103058069944382, 0.011109541170299053, 0.013218441978096962, 0.02168606035411358, 0.006693120580166578, 0.01107019279152155], [0.267109751701355, 0.0875839814543724, 0.038375139236450195, 0.14138434827327728, 0.0922815129160881, 0.08515460044145584, 0.06615043431520462, 0.09343228489160538, 0.009366502985358238, 0.04650744050741196, 0.006590259727090597, 0.009325448423624039, 0.011282031424343586, 0.00884530134499073, 0.021667787805199623, 0.004750333726406097, 0.010192859917879105], [0.12299053370952606, 0.03121461533010006, 0.12526974081993103, 0.1384095698595047, 0.042847469449043274, 0.03798247501254082, 0.08502328395843506, 0.28926846385002136, 0.008238401263952255, 0.022968605160713196, 0.004673621151596308, 0.009429254569113255, 0.020892977714538574, 0.006791743915528059, 0.0230410136282444, 0.007374530658125877, 0.02358374372124672], [0.019771628081798553, 
0.042925987392663956, 0.10725907981395721, 0.48143431544303894, 0.019195804372429848, 0.05173555016517639, 0.10353858768939972, 0.07466888427734375, 0.00993169192224741, 0.02089563198387623, 0.005202791187912226, 0.005305493250489235, 0.010906443931162357, 0.008418725803494453, 0.028224799782037735, 0.004674621392041445, 0.005909830331802368], [0.09123093634843826, 0.10900420695543289, 0.08921576291322708, 0.1496429443359375, 0.13611900806427002, 0.10640604794025421, 0.11674714088439941, 0.10638976097106934, 0.011168643832206726, 0.011797369457781315, 0.011537309736013412, 0.0073274956084787846, 0.00945147406309843, 0.00891185738146305, 0.020916972309350967, 0.004690505098551512, 0.009442619979381561], [0.35553979873657227, 0.025596454739570618, 0.06601442396640778, 0.1842540055513382, 0.07079135626554489, 0.03259963542222977, 0.022814271971583366, 0.17761483788490295, 0.005418842658400536, 0.015257677994668484, 0.0014392860466614366, 0.006486260332167149, 0.0061572943814098835, 0.004111186135560274, 0.013198990374803543, 0.002174907363951206, 0.010530714876949787], [0.06799765676259995, 0.17805378139019012, 0.0691291019320488, 0.06600046157836914, 0.21040737628936768, 0.14877986907958984, 0.0630408450961113, 0.0686224177479744, 0.017828356474637985, 0.015499861910939217, 0.018729453906416893, 0.012192045338451862, 0.01176519226282835, 0.016981029883027077, 0.017346838489174843, 0.0073031471110880375, 0.010322626680135727], [0.6129382848739624, 0.004320988431572914, 0.01258255448192358, 0.015285599045455456, 0.0030379530508071184, 0.004460067022591829, 0.0025749183259904385, 0.01666790433228016, 0.008098828606307507, 0.058936167508363724, 0.016184460371732712, 0.029880866408348083, 0.06292136013507843, 0.006277794949710369, 0.043976105749607086, 0.008016981184482574, 0.09383922815322876], [0.14908069372177124, 0.0024843106511980295, 0.023524150252342224, 0.01869337633252144, 0.004766706842929125, 0.002341205021366477, 0.009652119129896164, 0.04065156355500221, 
0.07159541547298431, 0.06913284212350845, 0.028304854407906532, 0.041655827313661575, 0.10864190757274628, 0.05730421468615532, 0.126399427652359, 0.023038694635033607, 0.22273264825344086], [0.08271943777799606, 0.004906867630779743, 0.041981372982263565, 0.06709666550159454, 0.007911065593361855, 0.005610652733594179, 0.015105141326785088, 0.022526629269123077, 0.04096611961722374, 0.32016775012016296, 0.02625729702413082, 0.054550983011722565, 0.07894545048475266, 0.03263111412525177, 0.08251748234033585, 0.023641759529709816, 0.09246420860290527], [0.551002025604248, 0.0070753698237240314, 0.021293554455041885, 0.018289392814040184, 0.005553947761654854, 0.0065783425234258175, 0.012154323048889637, 0.023979654535651207, 0.012174108996987343, 0.08879949152469635, 0.010085527785122395, 0.01277348306030035, 0.06179787218570709, 0.009651558473706245, 0.05894025042653084, 0.012710676528513432, 0.08714031428098679], [0.24295495450496674, 0.008797545917332172, 0.01897348091006279, 0.02240259200334549, 0.005888194777071476, 0.0081903962418437, 0.009061544202268124, 0.04251229017972946, 0.04628982022404671, 0.07775620371103287, 0.03597036749124527, 0.046389371156692505, 0.03849257156252861, 0.032815173268318176, 0.10736890137195587, 0.018913190811872482, 0.23722338676452637], [0.5907400846481323, 0.005485023837536573, 0.012210480868816376, 0.013892722316086292, 0.0033475093077868223, 0.005643690470606089, 0.002536748768761754, 0.01521963719278574, 0.008581125177443027, 0.06732795387506485, 0.017745228484272957, 0.03273386135697365, 0.06318624317646027, 0.007224984932690859, 0.04996907338500023, 0.00941492896527052, 0.0947408378124237], [0.16156518459320068, 0.02562759444117546, 0.04496842995285988, 0.03754839301109314, 0.026267273351550102, 0.025432039052248, 0.05358785018324852, 0.04366692155599594, 0.03720178082585335, 0.2182350754737854, 0.019072847440838814, 0.03881412371993065, 0.08296588063240051, 0.037006035447120667, 0.04472416266798973, 0.022943798452615738, 
0.0803726390004158], [0.7009365558624268, 0.017248960211873055, 0.007276283577084541, 0.007549286354333162, 0.007020256016403437, 0.012982342392206192, 0.0027963262982666492, 0.020802771672606468, 0.012614535167813301, 0.023595063015818596, 0.007564424071460962, 0.018587982282042503, 0.03691153973340988, 0.01124848984181881, 0.03711971640586853, 0.0020862880628556013, 0.07365916669368744], [0.04765614867210388, 0.02357564866542816, 0.0076897325925529, 0.006844497285783291, 0.023701030761003494, 0.018322352319955826, 0.006876892875880003, 0.011391970328986645, 0.09617432951927185, 0.10392188280820847, 0.128093883395195, 0.08617661893367767, 0.08871164917945862, 0.093619704246521, 0.08727706968784332, 0.04504602402448654, 0.1249205619096756]], [[0.6999444961547852, 0.033271368592977524, 0.013909603469073772, 0.006980339530855417, 0.022110717371106148, 0.02150537818670273, 0.02339259162545204, 0.053715016692876816, 0.007926936261355877, 0.01761786825954914, 0.008515228517353535, 0.010299251414835453, 0.014730553142726421, 0.008134471252560616, 0.010132250376045704, 0.008362770080566406, 0.039451174437999725], [0.7096490859985352, 0.1286257803440094, 0.01919744350016117, 0.009776546619832516, 0.02203425206243992, 0.029362838715314865, 0.006909705698490143, 0.007599890232086182, 0.0020745599176734686, 0.00481249438598752, 0.006637603975832462, 0.00887293554842472, 0.003468174487352371, 0.003521648235619068, 0.008425338193774223, 0.007191610522568226, 0.021840089932084084], [0.27386799454689026, 0.46732890605926514, 0.01999024860560894, 0.013360227458178997, 0.06342943012714386, 0.007877282798290253, 0.040805548429489136, 0.010506179183721542, 0.005935221444815397, 0.0036369431763887405, 0.0033280719071626663, 0.004359117709100246, 0.005070291925221682, 0.017964519560337067, 0.04368644952774048, 0.014768614433705807, 0.004084874410182238], [0.6985294222831726, 0.051490768790245056, 0.07849828898906708, 0.012298560701310635, 0.043022263795137405, 0.01631149835884571, 
[Raw notebook output truncated: nested lists of attention weights, printed as several 17×17 matrices of probabilities. Each inner list is one token's attention distribution over the 17 tokens of the input sequence, so every row sums to approximately 1; each 17×17 block corresponds to one attention head.]
0.190764382481575, 0.18201576173305511], [0.05771249160170555, 0.002397199161350727, 0.004390460904687643, 0.009716321714222431, 0.004288450814783573, 0.0018453211523592472, 0.0018793577328324318, 0.008720731362700462, 0.03622519597411156, 0.05535881966352463, 0.04244125261902809, 0.08322694152593613, 0.09775102883577347, 0.06345096975564957, 0.044456660747528076, 0.0827159509062767, 0.40342286229133606], [0.09034334868192673, 0.006762105971574783, 0.0022370461374521255, 0.0037526944652199745, 0.0024745946284383535, 0.006730477791279554, 0.0013155407505109906, 0.002513111801818013, 0.0729508176445961, 0.03344748169183731, 0.06496766954660416, 0.08229215443134308, 0.1665639728307724, 0.12602883577346802, 0.048981137573719025, 0.14047116041183472, 0.14816780388355255]], [[0.03507465124130249, 0.02125316672027111, 0.0711810365319252, 0.008124702610075474, 0.028302202001214027, 0.019387779757380486, 0.028941886499524117, 0.007855070754885674, 0.06748174130916595, 0.036652371287345886, 0.07082614302635193, 0.07026960700750351, 0.14037807285785675, 0.052011922001838684, 0.015374376438558102, 0.31704339385032654, 0.009841871447861195], [0.30932706594467163, 0.026621421799063683, 0.041242171078920364, 0.03280939534306526, 0.02902062050998211, 0.04835886508226395, 0.030427388846874237, 0.08751203864812851, 0.042487069964408875, 0.035302869975566864, 0.03572053089737892, 0.040987443178892136, 0.04589516669511795, 0.0345478281378746, 0.01720789447426796, 0.08370008319616318, 0.05883212760090828], [0.05251365154981613, 0.01540257129818201, 0.33565106987953186, 0.048401568084955215, 0.025340106338262558, 0.02088344655930996, 0.027371453121304512, 0.058214735239744186, 0.041148267686367035, 0.24191902577877045, 0.010634061880409718, 0.014496655203402042, 0.010111714713275433, 0.03621352091431618, 0.02442062459886074, 0.011479733511805534, 0.025797780603170395], [0.19920426607131958, 0.03732261434197426, 0.01813897304236889, 0.20547422766685486, 0.02700815163552761, 
0.029461078345775604, 0.12540896236896515, 0.15546664595603943, 0.0195823572576046, 0.05235731601715088, 0.012233883142471313, 0.006648073438555002, 0.010281623341143131, 0.019302314147353172, 0.02138448879122734, 0.011219250038266182, 0.04950566962361336], [0.25200411677360535, 0.0171364676207304, 0.05895107239484787, 0.036313097923994064, 0.021695924922823906, 0.017153698951005936, 0.04884760454297066, 0.12295135855674744, 0.03609376773238182, 0.04300033301115036, 0.04544870927929878, 0.048110101372003555, 0.03929496556520462, 0.02800506353378296, 0.0494893379509449, 0.0533183328807354, 0.08218605071306229], [0.26061415672302246, 0.05302901193499565, 0.07112182676792145, 0.030084414407610893, 0.02134588360786438, 0.028823940083384514, 0.03562429919838905, 0.06957235187292099, 0.03604262322187424, 0.03367941081523895, 0.03049767017364502, 0.051386669278144836, 0.06122678890824318, 0.03230244666337967, 0.028339462354779243, 0.10965961217880249, 0.046649497002363205], [0.07550113648176193, 0.0574786402285099, 0.03537502884864807, 0.09370726346969604, 0.013874487951397896, 0.024387361481785774, 0.2261875867843628, 0.02652926929295063, 0.02232443541288376, 0.0462244376540184, 0.012610047124326229, 0.006658135913312435, 0.019209476187825203, 0.01981288194656372, 0.21394480764865875, 0.0866384208202362, 0.019536582753062248], [0.3897954523563385, 0.025401389226317406, 0.04335080087184906, 0.036979757249355316, 0.02599744126200676, 0.02523444965481758, 0.0356181338429451, 0.019050469622015953, 0.03206954896450043, 0.02987455390393734, 0.03306412324309349, 0.042639825493097305, 0.054655857384204865, 0.037458181381225586, 0.03410709649324417, 0.07756749540567398, 0.0571354478597641], [0.12976402044296265, 0.012619482353329659, 0.041926220059394836, 0.00654192129150033, 0.012399904429912567, 0.00921829417347908, 0.01212246436625719, 0.02007998898625374, 0.12323251366615295, 0.07074414938688278, 0.025516817346215248, 0.05065397918224335, 0.09774885326623917, 
0.16823755204677582, 0.023121541365981102, 0.11990492790937424, 0.07616745680570602], [0.13293124735355377, 0.03645618259906769, 0.2655351758003235, 0.018429987132549286, 0.014506008476018906, 0.028547951951622963, 0.007614663802087307, 0.026449257507920265, 0.04769675061106682, 0.1458970606327057, 0.022883228957653046, 0.041723813861608505, 0.03475232794880867, 0.06567822396755219, 0.02448601834475994, 0.03521248325705528, 0.05119956284761429], [0.17031683027744293, 0.05871598422527313, 0.15613187849521637, 0.0437115803360939, 0.011033320799469948, 0.03992927819490433, 0.11453671008348465, 0.0392589196562767, 0.04303819686174393, 0.05116339772939682, 0.03361740708351135, 0.019817644730210304, 0.03733835741877556, 0.03912129998207092, 0.06405390053987503, 0.023233909159898758, 0.05498141795396805], [0.26652926206588745, 0.018265899270772934, 0.08591584116220474, 0.03026704490184784, 0.01592058129608631, 0.01656624674797058, 0.016426628455519676, 0.08285809308290482, 0.04046237841248512, 0.1206236258149147, 0.025002310052514076, 0.019135933369398117, 0.029701221734285355, 0.027329592034220695, 0.07956552505493164, 0.04300723224878311, 0.08242262154817581], [0.34223321080207825, 0.007403571158647537, 0.007057654205709696, 0.02260451205074787, 0.011644713580608368, 0.01019713282585144, 0.036437779664993286, 0.024436188861727715, 0.10541103780269623, 0.04736291989684105, 0.03320001810789108, 0.009939096868038177, 0.08591890335083008, 0.058175090700387955, 0.04184796288609505, 0.09818746894598007, 0.05794272944331169], [0.12496235221624374, 0.01265787798911333, 0.0344216488301754, 0.01073166262358427, 0.015664631500840187, 0.010663534514605999, 0.020246164873242378, 0.02499452605843544, 0.21983669698238373, 0.13158147037029266, 0.022832116112113, 0.06116678565740585, 0.05226214602589607, 0.09263266623020172, 0.01572021096944809, 0.08193518966436386, 0.06769031286239624], [0.12971986830234528, 0.01953803189098835, 0.03266414627432823, 0.01434003934264183, 
0.008641194552183151, 0.01585811749100685, 0.03875471651554108, 0.056302741169929504, 0.02605738490819931, 0.05778011307120323, 0.01755901239812374, 0.03622179850935936, 0.032669126987457275, 0.015651147812604904, 0.445513516664505, 0.0037283182609826326, 0.04900069162249565], [0.1585540920495987, 0.014617123641073704, 0.05991144850850105, 0.037549640983343124, 0.021935634315013885, 0.013210732489824295, 0.0725705474615097, 0.019603772088885307, 0.05697696655988693, 0.05389226973056793, 0.033428389579057693, 0.08881169557571411, 0.09632416814565659, 0.04154964163899422, 0.03833574429154396, 0.17068997025489807, 0.02203807607293129], [0.39049816131591797, 0.01921779476106167, 0.022208914160728455, 0.01832391507923603, 0.014105512760579586, 0.015184410847723484, 0.028397979214787483, 0.030795136466622353, 0.05581765994429588, 0.05468479171395302, 0.05097431316971779, 0.031241632997989655, 0.09070908278226852, 0.050291579216718674, 0.030384419485926628, 0.060416318476200104, 0.03674842044711113]], [[0.6835547089576721, 0.030619695782661438, 0.024648237973451614, 0.012478676624596119, 0.017658190801739693, 0.019794683903455734, 0.020877372473478317, 0.022738777101039886, 0.01563696563243866, 0.024021334946155548, 0.01455566193908453, 0.017218327149748802, 0.01777968741953373, 0.012504181824624538, 0.013269094750285149, 0.031215067952871323, 0.0214292760938406], [0.0025672775227576494, 0.037981852889060974, 0.6476801037788391, 0.20529739558696747, 0.05104002356529236, 0.009053058922290802, 0.0066690873354673386, 0.02334451675415039, 0.000893070362508297, 0.00032262824242934585, 0.00040878268191590905, 0.0015672162408009171, 0.001281112665310502, 0.002200143178924918, 0.0009534967830404639, 0.0023844081442803144, 0.006355911958962679], [0.047561198472976685, 0.19023966789245605, 0.08773227035999298, 0.2548995614051819, 0.22638654708862305, 0.07205486297607422, 0.030362559482455254, 0.02076762728393078, 0.03085460141301155, 0.00501651968806982, 0.00017405090329702944, 
0.0020866587292402983, 0.006739509757608175, 0.016070852056145668, 0.003930972423404455, 0.002131069777533412, 0.002991439076140523], [0.44619742035865784, 0.0140035105869174, 0.03543118014931679, 0.026564523577690125, 0.3607853949069977, 0.04167652875185013, 0.006196468137204647, 0.05929796025156975, 0.0009952345862984657, 0.0004245598684065044, 0.0002643977350089699, 0.00018885159806814045, 0.000269232114078477, 0.000381864927476272, 0.0007136853528209031, 0.004742712713778019, 0.001866512349806726], [0.00279593956656754, 0.004599690902978182, 0.028453757986426353, 0.08102523535490036, 0.08169020712375641, 0.321857213973999, 0.3893742561340332, 0.059846848249435425, 0.005436278413981199, 0.0019886037334799767, 0.005381810944527388, 0.00024507747730240226, 0.00026763873756863177, 0.0003397251130081713, 0.005489187315106392, 0.007650651037693024, 0.00355784990824759], [0.001991596771404147, 0.0020991384517401457, 0.006820362992584705, 0.041917480528354645, 0.046517979353666306, 0.0389304980635643, 0.5942765474319458, 0.2518586814403534, 0.003242288716137409, 0.00744320685043931, 0.0007108635036274791, 0.0021038311533629894, 0.0004685911117121577, 2.3702839826000854e-05, 0.00029038795037195086, 0.0005804987158626318, 0.0007244090083986521], [0.017098218202590942, 0.0030449756886810064, 0.04204912856221199, 0.000555754522792995, 0.007588332053273916, 0.024241598322987556, 0.014368959702551365, 0.7750883102416992, 0.0998205915093422, 0.003383415983989835, 0.003352218307554722, 0.0030420543625950813, 0.0017866584239527583, 0.003230432514101267, 9.501189924776554e-05, 0.0007286597392521799, 0.0005256577278487384], [0.1946885585784912, 0.0007938350317999721, 0.0055067166686058044, 0.010565552860498428, 0.013212727382779121, 0.028560053557157516, 0.06848783791065216, 0.27705031633377075, 0.3122110068798065, 0.06034360080957413, 0.009430354461073875, 0.011671703308820724, 0.0023629574570804834, 0.002527414122596383, 0.0021532184910029173, 0.0002488536119926721, 
0.00018520181765779853], [0.004473669454455376, 0.001349526341073215, 0.0005596168339252472, 0.003626809921115637, 0.0044386982917785645, 0.003662031376734376, 0.12421920150518417, 0.06682175397872925, 0.017130644991993904, 0.6020509004592896, 0.06115524843335152, 0.06610681116580963, 0.023587079718708992, 0.0008474200149066746, 0.008453533053398132, 0.011165961623191833, 0.00035119641688652337], [0.03805796802043915, 0.0027955675031989813, 0.001077538006938994, 0.0001239215926034376, 0.0013042258797213435, 0.008427001535892487, 0.006346818991005421, 0.037076160311698914, 0.11112441122531891, 0.021574050188064575, 0.26302972435951233, 0.36982065439224243, 0.05451921746134758, 0.045027270913124084, 0.0049868300557136536, 0.03032030165195465, 0.004388283006846905], [0.0002339819329790771, 0.0001503203238826245, 3.634392123785801e-05, 0.0001687453914200887, 1.8997658116859384e-05, 5.495042205438949e-05, 0.0001846592640504241, 0.001094429986551404, 0.004904120694845915, 0.006834257394075394, 0.00628278125077486, 0.9237989187240601, 0.04089167341589928, 0.007585433311760426, 0.005260462872684002, 0.0006855250103399158, 0.0018143767956644297], [0.018246766179800034, 0.003626000601798296, 0.0010573231847956777, 0.00020592061628121883, 0.00044333809637464583, 8.768463885644451e-05, 0.0010477087926119566, 0.011776024475693703, 0.012682265602052212, 0.02705039642751217, 0.0704200491309166, 0.01181273814290762, 0.44889014959335327, 0.1765323132276535, 0.03051019459962845, 0.17728671431541443, 0.00832442007958889], [0.009013975970447063, 0.0017323887441307306, 0.00813030730932951, 0.0006099447491578758, 0.0003372595820110291, 0.00035771142574958503, 0.00025945118977688253, 0.001733841490931809, 0.006379575002938509, 0.013327370397746563, 0.05485324189066887, 0.04168546199798584, 0.01541446428745985, 0.4224446415901184, 0.07960078120231628, 0.30029019713401794, 0.043829433619976044], [0.003527288557961583, 0.0033705581445246935, 0.0017958969110623002, 0.006987975910305977, 
0.0008359006606042385, 0.00036122617893852293, 0.0026796746533364058, 0.00015825964510440826, 0.00014985023881308734, 0.0070972018875181675, 0.0021169153042137623, 0.03400379419326782, 0.051015760749578476, 0.02188577875494957, 0.3721461594104767, 0.4560808837413788, 0.035786934196949005], [0.013157173991203308, 0.00765022961422801, 0.007690007798373699, 0.0005113592487759888, 0.007597107905894518, 0.0020123065914958715, 0.000723605917301029, 0.0008679111488163471, 0.000268133997451514, 0.0003639897040557116, 0.0024393678177148104, 0.0017095449147745967, 0.04405433312058449, 0.03876866400241852, 0.01856165938079357, 0.7617796063423157, 0.09184505045413971], [0.013514102436602116, 0.003335647750645876, 0.004363333340734243, 0.0032574981451034546, 0.00024108888464979827, 0.0017462241230532527, 0.002117993077263236, 0.0005242348415777087, 0.0002929646288976073, 0.00036590106901712716, 0.0012298704823479056, 0.006304456852376461, 0.03709431365132332, 0.015022866427898407, 0.029339397326111794, 0.03288932517170906, 0.8483607172966003], [0.23818586766719818, 0.004009730648249388, 0.005934701766818762, 0.012230360880494118, 0.004234594758599997, 0.0007666320889256895, 0.002873557386919856, 0.005815331358462572, 0.000986849539913237, 0.0026981448754668236, 0.0005416061612777412, 0.003058027010411024, 0.0315338633954525, 0.013829936273396015, 0.07865103334188461, 0.17703162133693695, 0.41761818528175354]], [[0.5748903751373291, 0.2055213898420334, 0.0021189977414906025, 0.0013741275761276484, 0.010712096467614174, 0.14480623602867126, 0.00690052006393671, 0.02354525960981846, 0.004093955270946026, 0.0024314592592418194, 0.0010480450000613928, 0.0020388357806950808, 0.001960600260645151, 0.0028788712806999683, 0.0007706018513999879, 0.0005029322928749025, 0.014405693858861923], [0.6202282905578613, 0.05801479518413544, 0.01603803224861622, 0.0995836928486824, 0.03990320861339569, 0.04373876750469208, 0.009945983067154884, 0.046737030148506165, 0.0019837168511003256, 
0.020817169919610023, 0.00585089111700654, 0.0029879482463002205, 0.0019064333755522966, 0.0014513868372887373, 0.008081994950771332, 0.0016799644799903035, 0.02105073817074299], [0.07212022691965103, 0.17141683399677277, 0.011774636805057526, 0.17223061621189117, 0.09503298997879028, 0.06763198226690292, 0.04786262288689613, 0.20975786447525024, 0.015425659716129303, 0.004829015117138624, 0.0037946358788758516, 0.019266359508037567, 0.012203112244606018, 0.01224062591791153, 0.025870582088828087, 0.02267357148230076, 0.03586873412132263], [0.46808117628097534, 0.04571835696697235, 0.030704684555530548, 0.045404739677906036, 0.08194413036108017, 0.04125532507896423, 0.1387435644865036, 0.09956207871437073, 0.004566682502627373, 0.012431211769580841, 0.0023831191938370466, 0.001737649436108768, 0.0028251931071281433, 0.003089276608079672, 0.0021989336237311363, 0.005020700395107269, 0.01433322299271822], [0.1803184598684311, 0.034573424607515335, 0.05301624536514282, 0.4264952838420868, 0.02292993664741516, 0.05125298723578453, 0.0260234996676445, 0.07284117490053177, 0.0025138994678854942, 0.055029790848493576, 0.00866913702338934, 0.004213879816234112, 0.0008359877392649651, 0.001163340755738318, 0.028745518997311592, 0.004427660722285509, 0.026949668303132057], [0.44204655289649963, 0.0537542998790741, 0.01040646806359291, 0.11555084586143494, 0.08234629780054092, 0.06093606352806091, 0.029317263513803482, 0.14425598084926605, 0.0015941639430820942, 0.0319666713476181, 0.004808458965271711, 0.0021366700530052185, 0.0019780765287578106, 0.0004961491213180125, 0.007310742978006601, 0.0012951588723808527, 0.00979999452829361], [0.11010923981666565, 0.03641044721007347, 0.027566038072109222, 0.2185249775648117, 0.02677285484969616, 0.07854060083627701, 0.011256965808570385, 0.3570291996002197, 0.014237076975405216, 0.03179732710123062, 0.0046987696550786495, 0.009977028705179691, 0.005547455046325922, 0.003923743963241577, 0.03134789690375328, 0.010670391842722893, 
0.0215899795293808], [0.5494299530982971, 0.05859263986349106, 0.012013251893222332, 0.02968718856573105, 0.0263986736536026, 0.11010942608118057, 0.021669108420610428, 0.12640173733234406, 0.01015305146574974, 0.018706828355789185, 0.006329825147986412, 0.007065310142934322, 0.002457199152559042, 0.0019661211408674717, 0.002662734128534794, 0.0008065528818406165, 0.015550383366644382], [0.06536279618740082, 0.0025010586250573397, 0.00534816225990653, 0.02648935280740261, 0.0015693290624767542, 0.003755953861400485, 0.042422108352184296, 0.21213607490062714, 0.009258792735636234, 0.29660090804100037, 0.0207565538585186, 0.0481487475335598, 0.15190944075584412, 0.002029404742643237, 0.022240549325942993, 0.028559746220707893, 0.060911014676094055], [0.1276891827583313, 0.007563093677163124, 0.0052275643683969975, 0.005885203834623098, 0.0030459442641586065, 0.018730707466602325, 0.030419515445828438, 0.32947471737861633, 0.04104619845747948, 0.0050090462900698185, 0.08824228495359421, 0.049041520804166794, 0.018300162628293037, 0.017343206331133842, 0.04928332939743996, 0.014393845573067665, 0.1893044263124466], [0.0932212695479393, 0.007898903451859951, 0.006572507321834564, 0.017050648108124733, 0.005419469904154539, 0.006595875136554241, 0.008183586411178112, 0.05790412798523903, 0.10048915445804596, 0.3671805262565613, 0.049166664481163025, 0.0824677050113678, 0.04933436959981918, 0.04703320190310478, 0.013704411685466766, 0.03412412479519844, 0.053653474897146225], [0.2898176312446594, 0.006054127123206854, 0.0036440405528992414, 0.0148227633908391, 0.0055983830243349075, 0.0041818018071353436, 0.02088664285838604, 0.16474846005439758, 0.06509563326835632, 0.07743409276008606, 0.04931618645787239, 0.0034628785215318203, 0.010789560154080391, 0.07114007323980331, 0.043165579438209534, 0.029482915997505188, 0.14035911858081818], [0.031310074031353, 0.0015320440288633108, 0.001122340327128768, 0.01144096814095974, 0.00019982451340183616, 0.0020434160251170397, 
0.005857226438820362, 0.02850714884698391, 0.18676814436912537, 0.021792737767100334, 0.017353367060422897, 0.09742826968431473, 0.014666317962110043, 0.23537909984588623, 0.16398537158966064, 0.05463998019695282, 0.125973641872406], [0.047279637306928635, 0.0026445304974913597, 0.004015921615064144, 0.01847054250538349, 0.0007131362217478454, 0.0010796872666105628, 0.014234290458261967, 0.03782493621110916, 0.0033129299990832806, 0.14728915691375732, 0.017355022951960564, 0.06086990237236023, 0.3371700942516327, 0.006533367559313774, 0.06553712487220764, 0.08295073360204697, 0.15271908044815063], [0.21096786856651306, 0.008210297673940659, 0.030143508687615395, 0.05999598652124405, 0.003010110929608345, 0.007182965520769358, 0.003914268221706152, 0.09623613208532333, 0.004739530850201845, 0.1234077662229538, 0.013820938766002655, 0.04265225678682327, 0.021119462326169014, 0.01205094251781702, 0.06651522219181061, 0.008013414219021797, 0.28801923990249634], [0.04040185734629631, 0.0029071501921862364, 0.017151542007923126, 0.031191686168313026, 0.000608003931120038, 0.0015823027351871133, 0.016976885497570038, 0.025271734222769737, 0.013457790948450565, 0.10208696126937866, 0.005976181477308273, 0.04973730817437172, 0.0627124160528183, 0.03763953223824501, 0.30612167716026306, 0.008525248616933823, 0.27765169739723206], [0.5110345482826233, 0.021214107051491737, 0.0031484689097851515, 0.006975341122597456, 0.005062417592853308, 0.008420225232839584, 0.0024418008979409933, 0.01910398341715336, 0.00479544885456562, 0.025157257914543152, 0.010209484957158566, 0.012089238502085209, 0.015126225538551807, 0.016794148832559586, 0.020974386483430862, 0.01814226433634758, 0.299310564994812]]], [[[0.2942064106464386, 0.039160460233688354, 0.03317175433039665, 0.016283294185996056, 0.1586073637008667, 0.05797572433948517, 0.06269232928752899, 0.04130171984434128, 0.039460714906454086, 0.02741401083767414, 0.042905692011117935, 0.023507410660386086, 0.03312396630644798, 
0.03594551607966423, 0.01842297986149788, 0.030025390908122063, 0.04579522833228111], [0.11234013736248016, 0.05400193855166435, 0.22000369429588318, 0.0892023965716362, 0.04441475495696068, 0.04633288457989693, 0.12221872061491013, 0.13413378596305847, 0.009912466630339622, 0.028676168993115425, 0.01018514670431614, 0.013805604539811611, 0.012227486819028854, 0.009516485035419464, 0.012937908060848713, 0.019573643803596497, 0.060516759753227234], [0.07918878644704819, 0.13633184134960175, 0.07489942014217377, 0.010963517241179943, 0.02356603369116783, 0.06768477708101273, 0.009014979004859924, 0.4256519377231598, 0.00580556457862258, 0.007527557667344809, 0.002217942615970969, 0.00226124981418252, 0.00528138130903244, 0.0034260950051248074, 0.003916064742952585, 0.007136075291782618, 0.1351267695426941], [0.1494211107492447, 0.04864032566547394, 0.04137279465794563, 0.18501988053321838, 0.1036091148853302, 0.027430381625890732, 0.08121100813150406, 0.20665158331394196, 0.004257259424775839, 0.003457094542682171, 0.022281158715486526, 0.0232387762516737, 0.010450298897922039, 0.0038015139289200306, 0.008541776798665524, 0.017915673553943634, 0.06270021200180054], [0.16649559140205383, 0.15137293934822083, 0.04484995827078819, 0.04200722277164459, 0.05168299004435539, 0.09953606128692627, 0.05699102580547333, 0.2679547071456909, 0.005656800698488951, 0.006703716702759266, 0.002015704521909356, 0.00437629921361804, 0.007285885978490114, 0.0030511224176734686, 0.009616243652999401, 0.004954732954502106, 0.07544898241758347], [0.18440648913383484, 0.06000363826751709, 0.16277562081813812, 0.040696606040000916, 0.046857479959726334, 0.053631287068128586, 0.08474541455507278, 0.14826610684394836, 0.015691353008151054, 0.019942766055464745, 0.012469463050365448, 0.015263124369084835, 0.02123332768678665, 0.013269676826894283, 0.016003357246518135, 0.023385388776659966, 0.08135880529880524], [0.10291789472103119, 0.1424521803855896, 0.01512816920876503, 
0.010362650267779827, 0.024956541135907173, 0.11401620507240295, 0.014625986106693745, 0.39091184735298157, 0.02003214880824089, 0.0047315312549471855, 0.0016476793680340052, 0.0016453261487185955, 0.005502951797097921, 0.007993718609213829, 0.0084117716178298, 0.007989581674337387, 0.12667381763458252], [0.2583490014076233, 0.051390502601861954, 0.03961044177412987, 0.03366640955209732, 0.18128560483455658, 0.0538775697350502, 0.08490049093961716, 0.03567209839820862, 0.024326015263795853, 0.01971268467605114, 0.06535258889198303, 0.023481445387005806, 0.02565721981227398, 0.022959930822253227, 0.029794815927743912, 0.021767769008874893, 0.028195347636938095], [0.048217400908470154, 0.031461894512176514, 0.0073286741971969604, 0.01988249458372593, 0.00930216908454895, 0.029779816046357155, 0.014332936145365238, 0.2404378205537796, 0.06219132989645004, 0.01606019213795662, 0.0033377560321241617, 0.030328329652547836, 0.13096046447753906, 0.029362710192799568, 0.00808022078126669, 0.0675598531961441, 0.2513759732246399], [0.45828524231910706, 0.026813596487045288, 0.00527245132252574, 0.0036671084817498922, 0.0740370973944664, 0.02479485049843788, 0.014194597490131855, 0.16702677309513092, 0.016573546454310417, 0.004108494613319635, 0.004460279364138842, 0.010082172229886055, 0.019173365086317062, 0.010918173007667065, 0.0027174027636647224, 0.009232476353645325, 0.1486424207687378], [0.24192474782466888, 0.02460918016731739, 0.0049170455895364285, 0.053803227841854095, 0.010416906327009201, 0.01665031537413597, 0.013680667616426945, 0.12323537468910217, 0.18243002891540527, 0.01017761044204235, 0.01942775398492813, 0.009469333104789257, 0.025349965319037437, 0.11039222031831741, 0.006360658910125494, 0.029608098790049553, 0.11754690110683441], [0.19179785251617432, 0.03607403114438057, 0.007599355187267065, 0.05878721550107002, 0.02901655063033104, 0.022668182849884033, 0.009196601808071136, 0.25340649485588074, 0.013587312772870064, 0.007168347481638193, 
0.0202492605894804, 0.015386853367090225, 0.0069312360137701035, 0.01162862591445446, 0.007474367041140795, 0.026734009385108948, 0.28229376673698425], [0.25642696022987366, 0.016078144311904907, 0.005840683821588755, 0.00486754160374403, 0.016855480149388313, 0.021550389006733894, 0.022262882441282272, 0.11863991618156433, 0.1705816686153412, 0.058954812586307526, 0.011266568675637245, 0.010386349633336067, 0.02374374121427536, 0.12205943465232849, 0.013529784977436066, 0.022475330159068108, 0.10448024421930313], [0.10627448558807373, 0.029222659766674042, 0.005022116936743259, 0.01839040406048298, 0.010139775462448597, 0.02755764126777649, 0.009408225305378437, 0.21240746974945068, 0.07160402834415436, 0.009935073554515839, 0.0032641200814396143, 0.021732883527874947, 0.1268530637025833, 0.030779622495174408, 0.004728104919195175, 0.08964622020721436, 0.22303417325019836], [0.37329256534576416, 0.05806131660938263, 0.00556525494903326, 0.006698992568999529, 0.015171640552580357, 0.03911692649126053, 0.057422224432229996, 0.22861680388450623, 0.02002881094813347, 0.003992111887782812, 0.0037069320678710938, 0.007012305781245232, 0.00407114252448082, 0.009052271023392677, 0.003640666836872697, 0.015430751256644726, 0.14911936223506927], [0.09488152712583542, 0.034952495247125626, 0.008762393146753311, 0.02270941622555256, 0.014650014229118824, 0.03508550301194191, 0.01500701904296875, 0.15862365067005157, 0.1304904669523239, 0.054211974143981934, 0.007967072539031506, 0.01394654717296362, 0.04352014884352684, 0.10877382010221481, 0.024538293480873108, 0.01472149882465601, 0.21715816855430603], [0.601395308971405, 0.015973228961229324, 0.012312193401157856, 0.009063299745321274, 0.05247541889548302, 0.021258752793073654, 0.019890563562512398, 0.023481935262680054, 0.02930409647524357, 0.016061773523688316, 0.03538122400641441, 0.01889180764555931, 0.032414760440588, 0.027522653341293335, 0.02159687504172325, 0.026658568531274796, 0.03631753474473953]], 
[… raw attention-weight output omitted: each block is a 17×17 matrix of softmax attention probabilities (one row per query token over the 17 tokens in the sequence, each row summing to 1), repeated per attention head/layer …]
0.028907230123877525, 0.016689879819750786, 0.04343951866030693, 0.00859843660145998, 0.14422668516635895, 0.0613672249019146, 0.024906964972615242, 0.039697229862213135, 0.024464523419737816, 0.046197280287742615, 0.02328525111079216, 0.019431978464126587, 0.058470141142606735, 0.11394807696342468], [0.25981080532073975, 0.04037712514400482, 0.008166557177901268, 0.01575249806046486, 0.014789687469601631, 0.026177292689681053, 0.008472079411149025, 0.052758339792490005, 0.08939885348081589, 0.047802381217479706, 0.023678692057728767, 0.09680598974227905, 0.08022461831569672, 0.0599735863506794, 0.03231457993388176, 0.06283147633075714, 0.08066533505916595], [0.3184558153152466, 0.03179793804883957, 0.006532294675707817, 0.021300408989191055, 0.019234439358115196, 0.020586656406521797, 0.006849492434412241, 0.06715408712625504, 0.08047652989625931, 0.03255534544587135, 0.058423276990652084, 0.027272721752524376, 0.05204018950462341, 0.047450654208660126, 0.04165048897266388, 0.07342234253883362, 0.09479725360870361], [0.15729250013828278, 0.044398095458745956, 0.010941185988485813, 0.01879245415329933, 0.015220096334815025, 0.026137061417102814, 0.005978063680231571, 0.04783995449542999, 0.12114252150058746, 0.051962222903966904, 0.033028390258550644, 0.05308720842003822, 0.05677604302763939, 0.07138530910015106, 0.04082570597529411, 0.10034617781639099, 0.1448470652103424], [0.31807082891464233, 0.06866411119699478, 0.008636459708213806, 0.013184705749154091, 0.027427302673459053, 0.0509958453476429, 0.0044118547812104225, 0.0864173173904419, 0.04712434485554695, 0.020672496408224106, 0.013058923184871674, 0.029771368950605392, 0.05219049006700516, 0.028300724923610687, 0.028539758175611496, 0.049859873950481415, 0.15267358720302582], [0.20261572301387787, 0.033346958458423615, 0.0078211585059762, 0.010914979502558708, 0.020004548132419586, 0.02041877619922161, 0.010511903092265129, 0.04238000512123108, 0.06677564978599548, 0.05525178089737892, 
0.02893720380961895, 0.08278454840183258, 0.12399426102638245, 0.052768904715776443, 0.03813236579298973, 0.10307128727436066, 0.10026988387107849], [0.086479052901268, 0.026898792013525963, 0.014558898285031319, 0.03878985345363617, 0.02113404870033264, 0.012427226640284061, 0.012107809074223042, 0.11183538287878036, 0.10491549968719482, 0.1290065199136734, 0.022619588300585747, 0.02078002132475376, 0.09285224974155426, 0.057389579713344574, 0.037040211260318756, 0.03523697331547737, 0.17592838406562805], [0.6081128120422363, 0.016238315030932426, 0.004509724676609039, 0.011494138278067112, 0.013666429556906223, 0.014681649394333363, 0.003276598174124956, 0.04067643731832504, 0.020164117217063904, 0.009898505173623562, 0.014938415959477425, 0.01751135103404522, 0.020025357604026794, 0.015707440674304962, 0.02811858057975769, 0.0406319759786129, 0.12034818530082703]], [[0.2965709865093231, 0.05796460807323456, 0.028041359037160873, 0.06846131384372711, 0.049271389842033386, 0.0419328548014164, 0.024971364066004753, 0.14257101714611053, 0.026336094364523888, 0.021561192348599434, 0.023983754217624664, 0.028174186125397682, 0.027149595320224762, 0.023773105815052986, 0.015901505947113037, 0.022946517914533615, 0.10038920491933823], [0.1620849072933197, 0.07640958577394485, 0.053408633917570114, 0.0429617278277874, 0.03849251568317413, 0.049491479992866516, 0.02253851853311062, 0.35348573327064514, 0.022746887058019638, 0.010367569513618946, 0.006514961831271648, 0.0072015211917459965, 0.010149077512323856, 0.020953739061951637, 0.009680919349193573, 0.005482788663357496, 0.10802946984767914], [0.32796457409858704, 0.0664539486169815, 0.002340588951483369, 0.09588932245969772, 0.096953846514225, 0.06315144896507263, 0.10226022452116013, 0.11719150841236115, 0.012873605825006962, 0.00693670054897666, 0.010051289573311806, 0.006668702233582735, 0.02084077149629593, 0.011893057264387608, 0.016445191577076912, 0.012358658015727997, 0.029726533219218254], 
[0.13260376453399658, 0.08388900011777878, 0.024045556783676147, 0.0023716071154922247, 0.03167177736759186, 0.05999348685145378, 0.02283409796655178, 0.340464323759079, 0.017512794584035873, 0.015509285032749176, 0.018998095765709877, 0.01168860960751772, 0.09478573501110077, 0.013187335804104805, 0.010122915729880333, 0.0238198172301054, 0.09650178253650665], [0.45002344250679016, 0.04326621815562248, 0.023874878883361816, 0.026409652084112167, 0.015789851546287537, 0.03234647214412689, 0.02699875459074974, 0.2045053094625473, 0.017781440168619156, 0.011987457983195782, 0.004687793552875519, 0.00794391892850399, 0.014181495644152164, 0.01691306009888649, 0.017679505050182343, 0.007848555222153664, 0.0777621790766716], [0.2955580949783325, 0.05694920942187309, 0.07093565165996552, 0.05220045521855354, 0.029995739459991455, 0.04047683626413345, 0.020987287163734436, 0.24816808104515076, 0.019382355734705925, 0.011134741827845573, 0.003946724347770214, 0.006022250279784203, 0.007192783523350954, 0.018431687727570534, 0.012680443935096264, 0.005082977470010519, 0.10085470229387283], [0.33947139978408813, 0.06250279396772385, 0.19213902950286865, 0.06544287502765656, 0.029661916196346283, 0.06670798361301422, 0.004766721744090319, 0.06272544711828232, 0.008210159838199615, 0.02393733523786068, 0.006362354848533869, 0.016089720651507378, 0.06075449287891388, 0.007174529600888491, 0.010380832478404045, 0.019972067326307297, 0.023700203746557236], [0.09940823167562485, 0.09289997816085815, 0.045589741319417953, 0.13956570625305176, 0.10594592988491058, 0.08467381447553635, 0.04422207176685333, 0.15545640885829926, 0.023374300450086594, 0.029611827805638313, 0.02025497704744339, 0.020671185106039047, 0.027196241542696953, 0.025844756513834, 0.009663451462984085, 0.020381417125463486, 0.055239904671907425], [0.4941464960575104, 0.017830027267336845, 0.010883533395826817, 0.015018312260508537, 0.005468357354402542, 0.010275592096149921, 0.015259749256074429, 
0.12616495788097382, 0.00798334926366806, 0.007314665243029594, 0.02900024689733982, 0.030528120696544647, 0.03532757610082626, 0.005817428696900606, 0.02185441367328167, 0.034009139984846115, 0.13311809301376343], [0.789246678352356, 0.007169062737375498, 0.004507251549512148, 0.01064229290932417, 0.009931791573762894, 0.0059505645185709, 0.01041458360850811, 0.048848774284124374, 0.010202658362686634, 0.0004368418885860592, 0.006939977873116732, 0.008138621225953102, 0.011395268142223358, 0.010558436624705791, 0.007068782113492489, 0.008172587491571903, 0.05037596449255943], [0.7570291757583618, 0.005169500596821308, 0.004183517768979073, 0.0017245536437258124, 0.005001619458198547, 0.0037477731239050627, 0.002091288100928068, 0.062359947711229324, 0.021220941096544266, 0.00537097966298461, 0.002068424364551902, 0.010308386757969856, 0.03712970018386841, 0.011901701800525188, 0.005133411381393671, 0.010311917401850224, 0.055247075855731964], [0.5845919251441956, 0.009964639320969582, 0.006419515702873468, 0.005631175823509693, 0.010992555879056454, 0.0051114861853420734, 0.0025202161632478237, 0.08422207832336426, 0.02567335031926632, 0.01109843235462904, 0.020684778690338135, 0.012485543265938759, 0.04537379369139671, 0.020455460995435715, 0.005697763524949551, 0.018804602324962616, 0.13027268648147583], [0.5869547724723816, 0.009998509660363197, 0.008663649670779705, 0.005790055729448795, 0.00996569637209177, 0.005736503284424543, 0.0019345359178259969, 0.07315226644277573, 0.01612280309200287, 0.006665478926151991, 0.009646130725741386, 0.027015507221221924, 0.01428042259067297, 0.01069749053567648, 0.015059876255691051, 0.03172402083873749, 0.1665922999382019], [0.5891892313957214, 0.011307432316243649, 0.008555843494832516, 0.01591670885682106, 0.005247768945991993, 0.006901806220412254, 0.010654964484274387, 0.09543900191783905, 0.005321124568581581, 0.006281343754380941, 0.027220914140343666, 0.028006751090288162, 0.02189008891582489, 0.004013807978481054, 
0.019893022254109383, 0.0387670174241066, 0.10539322346448898], [0.7979394197463989, 0.004585400223731995, 0.0035583695862442255, 0.012907174415886402, 0.00285825552418828, 0.0024788815062493086, 0.00041983116534538567, 0.03758448362350464, 0.014598624780774117, 0.005396198946982622, 0.007394668646156788, 0.0077080740593373775, 0.03056112676858902, 0.010578134097158909, 0.00033840228570625186, 0.006862669251859188, 0.05423015356063843], [0.6078534126281738, 0.009102982468903065, 0.005256517790257931, 0.005688854493200779, 0.005931193940341473, 0.005734494887292385, 0.004310488235205412, 0.06900998204946518, 0.023739254102110863, 0.007173456717282534, 0.03328167647123337, 0.03403979912400246, 0.031842414289712906, 0.021391989663243294, 0.017976811155676842, 0.008574506267905235, 0.10909217596054077], [0.20441009104251862, 0.030978698283433914, 0.011559037491679192, 0.04631765931844711, 0.036239732056856155, 0.025698883458971977, 0.008637321181595325, 0.1222122460603714, 0.031708549708127975, 0.03536667302250862, 0.037282370030879974, 0.04394269734621048, 0.05799673870205879, 0.037924546748399734, 0.021652495488524437, 0.05294360965490341, 0.19512876868247986]], [[0.3097732663154602, 0.07611045241355896, 0.028505505993962288, 0.051321763545274734, 0.05360238999128342, 0.06786826997995377, 0.02993333712220192, 0.10924328118562698, 0.020359760150313377, 0.025305403396487236, 0.014417652040719986, 0.021259214729070663, 0.028137091547250748, 0.014933962374925613, 0.018348360434174538, 0.029939303174614906, 0.10094097256660461], [0.32382121682167053, 0.052873674780130386, 0.026390526443719864, 0.02858990617096424, 0.024377157911658287, 0.03987927362322807, 0.03398594260215759, 0.14598192274570465, 0.02593742311000824, 0.03536425530910492, 0.02090512029826641, 0.018381379544734955, 0.035993870347738266, 0.023887448012828827, 0.031661566346883774, 0.018134748563170433, 0.11383458226919174], [0.36130234599113464, 0.034167371690273285, 0.003056361572816968, 
0.016692889854311943, 0.03206206485629082, 0.03063301369547844, 0.041085463017225266, 0.20007984340190887, 0.025691639631986618, 0.021005313843488693, 0.011044769547879696, 0.02232205681502819, 0.03607909381389618, 0.019792387261986732, 0.019628971815109253, 0.02782295271754265, 0.09753341972827911], [0.36917203664779663, 0.035691455006599426, 0.01841595396399498, 0.0023412376176565886, 0.023779405280947685, 0.028775649145245552, 0.03528599441051483, 0.1225910410284996, 0.03664986044168472, 0.08388462662696838, 0.02336886152625084, 0.03264681622385979, 0.05369969829916954, 0.029495390132069588, 0.01450154185295105, 0.013139416463673115, 0.07656116038560867], [0.3503187596797943, 0.05832456052303314, 0.042019329965114594, 0.06252440810203552, 0.024023819714784622, 0.045568518340587616, 0.028528563678264618, 0.08869403600692749, 0.04464893043041229, 0.032980676740407944, 0.020669126883149147, 0.014231042936444283, 0.03118501603603363, 0.03955991193652153, 0.02464015781879425, 0.011275513097643852, 0.0808076485991478], [0.2954652011394501, 0.0557560995221138, 0.023065125569701195, 0.03262137621641159, 0.029068676754832268, 0.039246104657649994, 0.03266097232699394, 0.12911869585514069, 0.03800166770815849, 0.0426485538482666, 0.022462062537670135, 0.01992109604179859, 0.037244442850351334, 0.03146583214402199, 0.041810229420661926, 0.014161068946123123, 0.11528283357620239], [0.5176048874855042, 0.02175033465027809, 0.06592509895563126, 0.017054669559001923, 0.01047627255320549, 0.01671774312853813, 0.002046997891739011, 0.14995016157627106, 0.02593217045068741, 0.028596408665180206, 0.01253886241465807, 0.011914929375052452, 0.009603501297533512, 0.019538739696145058, 0.008933363482356071, 0.009966001845896244, 0.07144976407289505], [0.11796353757381439, 0.07321932911872864, 0.05922737345099449, 0.05399736016988754, 0.06588315218687057, 0.06435232609510422, 0.0630432665348053, 0.11272327601909637, 0.02995479479432106, 0.03994587063789368, 0.03458644822239876, 
0.023355834186077118, 0.042384177446365356, 0.02815030701458454, 0.049182213842868805, 0.03708231821656227, 0.10494840145111084], [0.4405638575553894, 0.06216609477996826, 0.024902554228901863, 0.02215476892888546, 0.03576335683465004, 0.05957154557108879, 0.04554615914821625, 0.14371877908706665, 0.0002048244496108964, 0.0029951068572700024, 0.021087462082505226, 0.03582749515771866, 0.00459287129342556, 0.00023303313355427235, 0.015625810250639915, 0.032139793038368225, 0.05290646106004715], [0.42117029428482056, 0.04143610596656799, 0.006092498078942299, 0.11454833298921585, 0.034141480922698975, 0.050804637372493744, 0.015443027950823307, 0.1753954440355301, 0.0033253333531320095, 0.00017724001372698694, 0.008149168454110622, 0.011504090391099453, 0.004735163878649473, 0.003161653643473983, 0.010470982640981674, 0.027541300281882286, 0.07190326601266861], [0.5646669864654541, 0.033477023243904114, 0.007792834658175707, 0.004369085654616356, 0.0158088319003582, 0.02870163880288601, 0.006828259211033583, 0.11246144026517868, 0.02150244452059269, 0.017794713377952576, 0.003527245484292507, 0.017825162038207054, 0.04423310607671738, 0.015048909932374954, 0.01779608242213726, 0.016137506812810898, 0.07202871888875961], [0.4031176269054413, 0.048444714397192, 0.007807986345142126, 0.013952632434666157, 0.03815782070159912, 0.03946596011519432, 0.01272290013730526, 0.15799611806869507, 0.05029098689556122, 0.005117946304380894, 0.0179555993527174, 0.0013285815948620439, 0.018555017188191414, 0.036394815891981125, 0.026839597150683403, 0.01676013693213463, 0.10509151965379715], [0.4245763421058655, 0.02908717654645443, 0.019622495397925377, 0.09167809784412384, 0.01870063692331314, 0.027208421379327774, 0.00341343623585999, 0.16979795694351196, 0.011765527538955212, 0.0020747811067849398, 0.02077442780137062, 0.004534529522061348, 0.0005970205529592931, 0.006293031387031078, 0.007846449501812458, 0.05287493020296097, 0.10915481299161911], [0.5070799589157104, 
0.05312657356262207, 0.01926664635539055, 0.019094401970505714, 0.03115960955619812, 0.053129106760025024, 0.03742797300219536, 0.13670435547828674, 0.0002592909731902182, 0.002618880942463875, 0.019042203202843666, 0.025540700182318687, 0.0034402059391140938, 0.00025993879535235465, 0.012535981833934784, 0.02498341165482998, 0.054330844432115555], [0.5291358828544617, 0.017840318381786346, 0.021332835778594017, 0.011296787299215794, 0.00956611055880785, 0.017879504710435867, 0.0036630607210099697, 0.1037057489156723, 0.05965113267302513, 0.020368333905935287, 0.007805848494172096, 0.024187324568629265, 0.03566086292266846, 0.0507967509329319, 0.0027069412171840668, 0.01876167766749859, 0.06564092636108398], [0.34555691480636597, 0.04936022311449051, 0.006963954772800207, 0.008069342002272606, 0.028998441994190216, 0.0356253907084465, 0.01248763594776392, 0.10747640579938889, 0.07810013741254807, 0.027188358828425407, 0.0448719747364521, 0.017323995009064674, 0.043513376265764236, 0.054780300706624985, 0.05175561085343361, 0.006437510251998901, 0.08149056881666183], [0.12881092727184296, 0.06950867176055908, 0.03859624266624451, 0.05177735164761543, 0.06152310222387314, 0.060178741812705994, 0.04639644920825958, 0.12166110426187515, 0.03004666231572628, 0.032973237335681915, 0.04428943246603012, 0.024865174666047096, 0.03649240732192993, 0.028815530240535736, 0.05670784041285515, 0.04551705718040466, 0.12184006720781326]], [[0.7110787630081177, 0.027927642688155174, 0.009552759118378162, 0.014623880386352539, 0.015180319547653198, 0.024047110229730606, 0.007120250258594751, 0.08197902143001556, 0.007845137268304825, 0.005259108263999224, 0.007155620492994785, 0.015986017882823944, 0.006887119263410568, 0.004580909386277199, 0.00681166211143136, 0.00976585689932108, 0.04419892281293869], [0.7689254879951477, 0.026723772287368774, 0.017129555344581604, 0.056290559470653534, 0.01882445625960827, 0.03465570509433746, 0.007394162006676197, 0.052174292504787445, 
0.004640564788132906, 0.0010288794292137027, 0.000809956225566566, 0.0023578691761940718, 0.0010662488639354706, 0.0018314884509891272, 0.0006359520484693348, 0.0024800170212984085, 0.0030311348382383585], [0.5054264068603516, 0.05908431485295296, 0.01152538787573576, 0.0853986069560051, 0.03131937235593796, 0.08983318507671356, 0.08388520032167435, 0.022242950275540352, 0.013807837851345539, 0.006013629026710987, 0.004907405935227871, 0.006878827698528767, 0.0017845273250713944, 0.006666569039225578, 0.006865022704005241, 0.05985284596681595, 0.004507862962782383], [0.05515551194548607, 0.2709078788757324, 0.03020547516644001, 0.059518877416849136, 0.24618178606033325, 0.1714058220386505, 0.05475560203194618, 0.08084814995527267, 0.00529443146660924, 0.00539929885417223, 0.0033454729709774256, 0.0037494811695069075, 0.0022265207953751087, 0.0018627563258633018, 0.0026252958923578262, 0.0008570272475481033, 0.0056606768630445], [0.08860161900520325, 0.16614261269569397, 0.10864855349063873, 0.19587187469005585, 0.026470569893717766, 0.06053324416279793, 0.13963888585567474, 0.18032796680927277, 0.0072058020159602165, 0.003577714553102851, 0.003631172701716423, 0.00317259319126606, 0.002401125617325306, 0.0014812910230830312, 0.00213058665394783, 0.005159215070307255, 0.0050051165744662285], [0.16291791200637817, 0.2019491344690323, 0.07593633234500885, 0.13697446882724762, 0.043486788868904114, 0.08810041844844818, 0.02907589077949524, 0.21042227745056152, 0.028409093618392944, 0.0030243671499192715, 0.002265175571665168, 0.0020514256320893764, 0.002718042116612196, 0.004621841479092836, 0.0014017298817634583, 0.002168684732168913, 0.004476376809179783], [0.3419612646102905, 0.06726314127445221, 0.027406899258494377, 0.2288077175617218, 0.08786772191524506, 0.1352757066488266, 0.015128012746572495, 0.04535001516342163, 0.014032970182597637, 0.00563087360933423, 0.0009072761749848723, 0.014148584567010403, 0.002122335135936737, 0.004114200826734304, 
0.002215046901255846, 0.005772580858319998, 0.001995665952563286], [0.8622167706489563, 0.013754216954112053, 0.004114906303584576, 0.007589736022055149, 0.011954630725085735, 0.031024159863591194, 0.006212920416146517, 0.045845817774534225, 0.0017793033039197326, 0.0013376649003475904, 0.002112209564074874, 0.004218058194965124, 0.0007563874823972583, 0.0004498552007135004, 0.0003593052679207176, 0.0019464321667328477, 0.00432771909981966], [0.6874514222145081, 0.019140642136335373, 0.004409873392432928, 0.007162123452872038, 0.007483558729290962, 0.042244695127010345, 0.004043205175548792, 0.1125873550772667, 0.0024270396679639816, 0.0065034315921366215, 0.005022839177399874, 0.05935463309288025, 0.006321829743683338, 0.0009106681682169437, 0.0017249510856345296, 0.015043573454022408, 0.01816817745566368], [0.7241984605789185, 0.01891995407640934, 0.006214221939444542, 0.013416610658168793, 0.013520952314138412, 0.03287103772163391, 0.005572477821260691, 0.12138576805591583, 0.009694283828139305, 0.0005665191565640271, 0.004584840033203363, 0.01580716483294964, 0.0030038179829716682, 0.0033317464403808117, 0.0007722878362983465, 0.005620711017400026, 0.02051912620663643], [0.6256803870201111, 0.01544839609414339, 0.004969930276274681, 0.0071341427974402905, 0.01677573285996914, 0.03486441448330879, 0.009676879271864891, 0.21265809237957, 0.00704914191737771, 0.0015971852699294686, 0.0010824110358953476, 0.016238976269960403, 0.002351901028305292, 0.0026962580159306526, 0.0017549839103594422, 0.014265239238739014, 0.025755880400538445], [0.5370416641235352, 0.04764509201049805, 0.014612805098295212, 0.006854006554931402, 0.023454369977116585, 0.05693439021706581, 0.009042345918715, 0.16389532387256622, 0.03377465531229973, 0.007931066676974297, 0.037468332797288895, 0.004621597006917, 0.004921542014926672, 0.009726802818477154, 0.007861110381782055, 0.003718385938555002, 0.03049648553133011], [0.5746358633041382, 0.03487038239836693, 0.0035669011995196342, 
0.013662824407219887, 0.010050659999251366, 0.03838569298386574, 0.012951690703630447, 0.14810103178024292, 0.01723843440413475, 0.0076309689320623875, 0.010396546684205532, 0.019433971494436264, 0.0013526052935048938, 0.008914723992347717, 0.005154790822416544, 0.0383501872420311, 0.05530280992388725], [0.6587098240852356, 0.011414572596549988, 0.002196417422965169, 0.0038269194774329662, 0.0033770480658859015, 0.022111741825938225, 0.0022246174048632383, 0.14017727971076965, 0.002240057336166501, 0.0038571900222450495, 0.00511352252215147, 0.030566321685910225, 0.007710849866271019, 0.0014465353451669216, 0.0033194960560649633, 0.02675457112491131, 0.07495308667421341], [0.863104522228241, 0.005362910684198141, 0.0018485154723748565, 0.0034691973123699427, 0.010621408931910992, 0.01624123938381672, 0.0020722737535834312, 0.04495903104543686, 0.0032996542286127806, 0.00044658457045443356, 0.0011551921488717198, 0.008624700829386711, 0.00197734241373837, 0.0024578964803367853, 0.0003257961943745613, 0.003830981906503439, 0.030202768743038177], [0.34848809242248535, 0.018607933074235916, 0.0037660030648112297, 0.008738338015973568, 0.015155485831201077, 0.030680716037750244, 0.01632816158235073, 0.12528109550476074, 0.09329309314489365, 0.013585846871137619, 0.010476295836269855, 0.014774148352444172, 0.08450304716825485, 0.15278802812099457, 0.01722089946269989, 0.006792874541133642, 0.03951994702219963], [0.8442959785461426, 0.003645744640380144, 0.0007190723554231226, 0.001659274217672646, 0.002519639441743493, 0.0075706737115979195, 0.0011258405866101384, 0.057927630841732025, 0.0027019220869988203, 0.001275690970942378, 0.0027634426951408386, 0.004853555932641029, 0.0022383565083146095, 0.0030044398736208677, 0.002414053538814187, 0.004696938209235668, 0.056587785482406616]], [[0.38903576135635376, 0.07133189588785172, 0.01173278596252203, 0.011611643247306347, 0.03593862056732178, 0.04239055886864662, 0.009422672912478447, 0.17532263696193695, 
0.007530996110290289, 0.01816808432340622, 0.009301709942519665, 0.01393071562051773, 0.030338410288095474, 0.007863985374569893, 0.011107753030955791, 0.015544839203357697, 0.13942685723304749], [0.24447773396968842, 0.024418605491518974, 0.005231281742453575, 0.009238220751285553, 0.06914270669221878, 0.13657811284065247, 0.008181962184607983, 0.18644508719444275, 0.0212774109095335, 0.031458813697099686, 0.021971862763166428, 0.02044578455388546, 0.02386799454689026, 0.014258131384849548, 0.013974858447909355, 0.007653276436030865, 0.1613781601190567], [0.2509060204029083, 0.02114516869187355, 0.41429293155670166, 0.0007701053400523961, 0.004686474800109863, 0.02570347674190998, 0.002771765924990177, 0.09230562299489975, 0.005403014365583658, 0.049270082265138626, 0.00910914409905672, 0.007360628806054592, 0.0007370669627562165, 0.0032858906779438257, 0.007006220519542694, 0.004124969244003296, 0.10112148523330688], [0.295585960149765, 0.06220528110861778, 0.0041531529277563095, 0.26937955617904663, 0.00662674754858017, 0.022593991830945015, 0.006198883056640625, 0.12951865792274475, 0.012682843022048473, 0.019845308735966682, 0.020230544731020927, 0.003786548273637891, 0.007568683009594679, 0.014782285317778587, 0.019013414159417152, 0.003552051493898034, 0.10227608680725098], [0.12344686686992645, 0.04357608035206795, 0.00291255721822381, 0.0010759409051388502, 0.4924272298812866, 0.0130477175116539, 0.002582717686891556, 0.12221682816743851, 0.026410656049847603, 0.0045435745269060135, 0.004750614520162344, 0.0027752534952014685, 0.013904971070587635, 0.015176290646195412, 0.007841837592422962, 0.0011693054111674428, 0.12214155495166779], [0.2540737986564636, 0.30792585015296936, 0.008542569354176521, 0.010614048689603806, 0.03832758963108063, 0.02322378195822239, 0.002646507928147912, 0.11939003318548203, 0.01744941622018814, 0.02782294526696205, 0.007238340564072132, 0.01058943010866642, 0.02251370996236801, 0.01374638732522726, 0.008555963635444641, 
0.006178680341690779, 0.12116090953350067], [0.3064430058002472, 0.07677601277828217, 0.018216054886579514, 0.037853650748729706, 0.00851031020283699, 0.008972520008683205, 0.3541720509529114, 0.06866519898176193, 0.0051787071861326694, 0.011201721616089344, 0.003618421498686075, 0.0007962197414599359, 0.004803159274160862, 0.007356555201113224, 0.008031015284359455, 0.0015090003143996, 0.07789646834135056], [0.2596839666366577, 0.0846216008067131, 0.04221224784851074, 0.06803110241889954, 0.03722790256142616, 0.03896056115627289, 0.014936673454940319, 0.041980061680078506, 0.01769174076616764, 0.04076812043786049, 0.02048587240278721, 0.021915532648563385, 0.03866751492023468, 0.02959289588034153, 0.04400334879755974, 0.04380278289318085, 0.15541806817054749], [0.03808952495455742, 0.015375161543488503, 0.0007993357139639556, 0.0007905364618636668, 0.003858056152239442, 0.006685994099825621, 0.00027509077335707843, 0.009094920009374619, 0.3042495846748352, 0.0004373933479655534, 0.0013684971490874887, 0.0008673985139466822, 0.00312657724134624, 0.6027000546455383, 0.00045409484300762415, 0.00035374233266338706, 0.01147408876568079], [0.2508527934551239, 0.02940240129828453, 0.06534533202648163, 0.008422117680311203, 0.006698856130242348, 0.0397794209420681, 0.011135798878967762, 0.2847210764884949, 0.002817385597154498, 0.1251867413520813, 0.0036841267719864845, 0.0011069232132285833, 0.006826427299529314, 0.006079099606722593, 0.017484400421380997, 0.0008016785141080618, 0.1396554410457611], [0.21936455368995667, 0.04131676256656647, 0.011005657725036144, 0.015474973246455193, 0.008186543360352516, 0.01935935765504837, 0.0036965063773095608, 0.13971573114395142, 0.01750980317592621, 0.009601340629160404, 0.4141910970211029, 0.001424020854756236, 0.005060848314315081, 0.016052646562457085, 0.017519867047667503, 0.0016003827331587672, 0.05891992524266243], [0.20305441319942474, 0.06034668907523155, 0.010924513451755047, 0.00515152420848608, 0.007079590577632189, 
0.0005056040245108306, 0.01611877605319023, 0.08227851986885071], [0.05343194678425789, 0.007357680704444647, 0.0008152057998813689, 0.003345476696267724, 0.7409043908119202, 0.007481908425688744, 0.00017718825256451964, 0.06375930458307266, 0.014160959981381893, 0.0002252731064800173, 0.0009103623451665044, 0.009406579658389091, 0.0017068530432879925, 0.011185236275196075, 0.0013475895393639803, 0.0019056780729442835, 0.08187828958034515], [0.07601923495531082, 0.45464256405830383, 0.0016442922642454505, 0.005464432295411825, 0.005849150009453297, 0.1937789022922516, 0.0013004663633182645, 0.14252859354019165, 0.0065056439489126205, 0.001619619899429381, 0.0005873207119293511, 0.0018004789017140865, 0.0006894461112096906, 0.004495232366025448, 0.007708333898335695, 0.0008719105971977115, 0.09449435025453568], [0.14645135402679443, 0.00824565440416336, 0.003093477338552475, 0.021993644535541534, 0.0005127917975187302, 0.005400036461651325, 0.5194729566574097, 0.12501980364322662, 0.0007646023877896369, 0.006850738078355789, 0.003791574388742447, 0.0018295790068805218, 0.0006944182678125799, 0.0004898236365988851, 0.010378297418355942, 0.0007684637093916535, 0.1442427933216095], [0.18492233753204346, 0.051034316420555115, 0.015085405670106411, 0.042665183544158936, 0.02634424902498722, 0.035478249192237854, 0.05155931040644646, 0.17810136079788208, 0.020249750465154648, 0.04976426810026169, 0.01902586594223976, 0.02346872165799141, 0.015254486352205276, 0.015495894476771355, 0.026960913091897964, 0.0406220480799675, 0.20396772027015686], [0.11262324452400208, 0.0034453312400728464, 0.0025132603477686644, 0.0005145873292349279, 0.019906997680664062, 0.0038114232011139393, 0.0007193675264716148, 0.10669032484292984, 0.23912738263607025, 0.0010404441272839904, 0.0005012480542063713, 0.005631100386381149, 0.005967219825834036, 0.4252619445323944, 0.0007809204398654401, 0.002135257702320814, 0.06932993978261948], [0.15359656512737274, 0.02635994926095009, 
0.05118342489004135, 0.004991552792489529, 0.0014750513946637511, 0.013714841566979885, 0.021671149879693985, 0.20006506145000458, 0.009619561024010181, 0.3566414415836334, 0.0008415657794103026, 0.009659632109105587, 0.00103670300450176, 0.01356533169746399, 0.016946449875831604, 0.002617922145873308, 0.11601381748914719], [0.17769388854503632, 0.0027802397962659597, 0.0015493643004447222, 0.006638471968472004, 0.0032385496888309717, 0.002391247544437647, 0.007084549404680729, 0.20605194568634033, 0.001683704787865281, 0.0003792982897721231, 0.4159244894981384, 0.0007561771781183779, 0.008840864524245262, 0.0020081130787730217, 0.001441675703972578, 0.0004729137581307441, 0.16106446087360382], [0.0668032243847847, 0.0026265436317771673, 0.0035894776228815317, 0.0006340295076370239, 0.008292131125926971, 0.002639294136315584, 0.0014065058203414083, 0.07170724123716354, 0.0018339167581871152, 0.0006682593375444412, 0.0002708070387598127, 0.8025994896888733, 0.0007157629006542265, 0.0018917274428531528, 0.0003930495004169643, 0.0005359607748687267, 0.03339254856109619], [0.11458969116210938, 0.006263757590204477, 0.0023777647875249386, 0.002864033216610551, 0.01796647720038891, 0.009822512976825237, 0.0018172648269683123, 0.08093534409999847, 0.02149457484483719, 0.0011926060542464256, 0.006302511319518089, 0.013844604603946209, 0.5653843879699707, 0.02411363460123539, 0.0026775277219712734, 0.007272959686815739, 0.12108036130666733], [0.17340035736560822, 0.0020083824638277292, 0.0027011942584067583, 0.0006685554981231689, 0.015232319943606853, 0.0014870482264086604, 0.0008552268263883889, 0.10619340091943741, 0.3732151687145233, 0.0022064638324081898, 0.0009895730763673782, 0.0050681582652032375, 0.0027907853946089745, 0.20185750722885132, 0.0014800208155065775, 0.0021708288695663214, 0.1076749935746193], [0.12609931826591492, 0.028834374621510506, 0.006291717756539583, 0.0016308606136590242, 0.000967914005741477, 0.00683382386341691, 0.002945375395938754, 
0.08620961755514145, 0.0005601923330686986, 0.005568929947912693, 0.0008325201342813671, 0.0004995049675926566, 0.0004126435669604689, 0.000431784923421219, 0.6565546989440918, 0.0028775108512490988, 0.07244917750358582], [0.0808182954788208, 0.007231817115098238, 0.0007726553594693542, 0.0107741579413414, 0.004917825106531382, 0.00674121268093586, 0.0012940966989845037, 0.20596443116664886, 0.0030732129234820604, 0.00018728358554653823, 0.00017955902148969471, 0.0024609218817204237, 0.0017491645412519574, 0.002376766176894307, 0.0017995280213654041, 0.539187490940094, 0.13047149777412415], [0.18937397003173828, 0.05623513460159302, 0.02961532026529312, 0.042857296764850616, 0.023699268698692322, 0.028199635446071625, 0.06605412065982819, 0.19971922039985657, 0.025171738117933273, 0.039639391005039215, 0.015038169920444489, 0.012659987434744835, 0.03845212236046791, 0.027315106242895126, 0.04270714521408081, 0.033653244376182556, 0.12960918247699738]], [[0.7072710394859314, 0.02025127410888672, 0.008623739704489708, 0.022592656314373016, 0.009532532654702663, 0.012545792385935783, 0.00797272752970457, 0.05966459587216377, 0.024847362190485, 0.012376786209642887, 0.00644359365105629, 0.007226041983813047, 0.01551248412579298, 0.020753052085638046, 0.005207892507314682, 0.013097947463393211, 0.04608050361275673], [0.2058332860469818, 0.1329808384180069, 0.06840626150369644, 0.08813412487506866, 0.0321757011115551, 0.08071182668209076, 0.035094212740659714, 0.11282630264759064, 0.042788613587617874, 0.015489951707422733, 0.014115683734416962, 0.009140207432210445, 0.04922871291637421, 0.045428887009620667, 0.007329728454351425, 0.01670478843152523, 0.043610889464616776], [0.14562542736530304, 0.18104705214500427, 0.02394924871623516, 0.10756579786539078, 0.04833333566784859, 0.11318337172269821, 0.0711887776851654, 0.13359986245632172, 0.02750270999968052, 0.008604022674262524, 0.008918151259422302, 0.004984763450920582, 0.028118327260017395, 0.022080594673752785, 
0.011575060896575451, 0.014446396380662918, 0.04927710443735123], [0.25682151317596436, 0.12538205087184906, 0.10437680780887604, 0.03524830564856529, 0.04944097250699997, 0.06037474051117897, 0.035989273339509964, 0.09861882030963898, 0.054988257586956024, 0.01650240086019039, 0.008534936234354973, 0.017553573474287987, 0.0325353667140007, 0.049732524901628494, 0.015073303133249283, 0.00849836878478527, 0.030328797176480293], [0.381420761346817, 0.05739995464682579, 0.051154471933841705, 0.0935162901878357, 0.015592883341014385, 0.03177342563867569, 0.07335171103477478, 0.12216147780418396, 0.04713563993573189, 0.006516763940453529, 0.010556009598076344, 0.0056880321353673935, 0.03486748784780502, 0.030796300619840622, 0.0014682153705507517, 0.006805592682212591, 0.02979503758251667], [0.18502046167850494, 0.12758572399616241, 0.050649192184209824, 0.09789030253887177, 0.03500235080718994, 0.08044002205133438, 0.04070030525326729, 0.1257300078868866, 0.0493810810148716, 0.011870051734149456, 0.011907748878002167, 0.006355403922498226, 0.05904830992221832, 0.05142189562320709, 0.006249067839235067, 0.01577920652925968, 0.04496882110834122], [0.24962207674980164, 0.06421149522066116, 0.07388907670974731, 0.12005764991044998, 0.03547848388552666, 0.04370953515172005, 0.009727505967020988, 0.19703389704227448, 0.04452044144272804, 0.013740920461714268, 0.010302780196070671, 0.006059207953512669, 0.010932795703411102, 0.041698601096868515, 0.006756617221981287, 0.00753893842920661, 0.06471994519233704], [0.8676420450210571, 0.009835900738835335, 0.002584961475804448, 0.004810415208339691, 0.007630026899278164, 0.004555467050522566, 0.0026950433384627104, 0.046919312328100204, 0.008078555576503277, 0.002796290908008814, 0.001584212644957006, 0.0015981710748746991, 0.004415057133883238, 0.008153198286890984, 0.0018554717535153031, 0.0036000539548695087, 0.02124566212296486], [0.1824914813041687, 0.08712238818407059, 0.04976505786180496, 0.028436629101634026, 
0.018137233331799507, 0.07061360031366348, 0.06778471171855927, 0.11746978014707565, 0.030701857060194016, 0.020283183082938194, 0.05955550819635391, 0.033622290939092636, 0.020324481651186943, 0.01827339082956314, 0.0497424453496933, 0.06593020260334015, 0.0797458067536354], [0.27385109663009644, 0.06333151459693909, 0.01755063608288765, 0.01068738754838705, 0.015490111894905567, 0.05293416604399681, 0.006554890889674425, 0.08919497579336166, 0.09335487335920334, 0.020913520827889442, 0.018606366589665413, 0.027294263243675232, 0.09664979577064514, 0.05416751280426979, 0.008564081974327564, 0.05615193396806717, 0.0947028324007988], [0.48727378249168396, 0.023071911185979843, 0.0075753857381641865, 0.026162711903452873, 0.006361059844493866, 0.01431692112237215, 0.01638052426278591, 0.07400745898485184, 0.06480858474969864, 0.009130040183663368, 0.004551251884549856, 0.017120515927672386, 0.09465217590332031, 0.04308564215898514, 0.009638577699661255, 0.053390566259622574, 0.04847288504242897], [0.16506165266036987, 0.016937734559178352, 0.009962411597371101, 0.026620345190167427, 0.017974555492401123, 0.02112235501408577, 0.006529505830258131, 0.030426068231463432, 0.11578191816806793, 0.04105665162205696, 0.03256804496049881, 0.022309020161628723, 0.27163925766944885, 0.12392012029886246, 0.011107023805379868, 0.05113895237445831, 0.0358443483710289], [0.20663395524024963, 0.0599508173763752, 0.008706527762115002, 0.018223673105239868, 0.011991174891591072, 0.04738074913620949, 0.02089555747807026, 0.05381520837545395, 0.1368630826473236, 0.010366475209593773, 0.06298252195119858, 0.045971982181072235, 0.04792414978146553, 0.12080392241477966, 0.02490728162229061, 0.04904209077358246, 0.07354079931974411], [0.22065909206867218, 0.07472027093172073, 0.03604720160365105, 0.020893141627311707, 0.015495178289711475, 0.06506757438182831, 0.06635229289531708, 0.1403273642063141, 0.032369840890169144, 0.01293972972780466, 0.04850924387574196, 0.025209493935108185, 
0.023000935092568398, 0.02444223314523697, 0.03753481060266495, 0.05492141470313072, 0.101510189473629], [0.6286150217056274, 0.013461850583553314, 0.0055913557298481464, 0.0010578696383163333, 0.002348033245652914, 0.009881854988634586, 0.00440909992903471, 0.057110778987407684, 0.05197405070066452, 0.004761812277138233, 0.014561306685209274, 0.005125395022332668, 0.08251536637544632, 0.04107102379202843, 0.0008804557146504521, 0.027141733095049858, 0.04949304834008217], [0.06919750571250916, 0.015849336981773376, 0.0034721260890364647, 0.0072216917760670185, 0.004556418862193823, 0.01166294515132904, 0.0030687975231558084, 0.01746642403304577, 0.24035799503326416, 0.02604944072663784, 0.021381542086601257, 0.006992953363806009, 0.2801233232021332, 0.25395768880844116, 0.005604046396911144, 0.011747616343200207, 0.021290110424160957], [0.8101101517677307, 0.007141087669879198, 0.0013573500327765942, 0.003956690896302462, 0.004529180005192757, 0.004052420612424612, 0.0017949139000847936, 0.03936672583222389, 0.01639990508556366, 0.006694929674267769, 0.00391845079138875, 0.004969191271811724, 0.014184552244842052, 0.019815120846033096, 0.005173931363970041, 0.008216982707381248, 0.04831838235259056]], [[0.7205291986465454, 0.01972651109099388, 0.004841483663767576, 0.008108249865472317, 0.00906450767070055, 0.018766412511467934, 0.010340290144085884, 0.09753643721342087, 0.009674835950136185, 0.002809976926073432, 0.004401346668601036, 0.004032909870147705, 0.005608757957816124, 0.006970263551920652, 0.001958990702405572, 0.006524901837110519, 0.0691048800945282], [0.1944400817155838, 0.15130627155303955, 0.014380430802702904, 0.04605203494429588, 0.07361453771591187, 0.1953025758266449, 0.04238109290599823, 0.13135938346385956, 0.007811491843312979, 0.0047035724855959415, 0.014010553248226643, 0.006448366679251194, 0.01134528312832117, 0.005740555468946695, 0.01579647697508335, 0.006385642569512129, 0.07892167568206787], [0.3568216860294342, 0.0807134285569191, 
0.012620361521840096, 0.07988269627094269, 0.08753029257059097, 0.0518876314163208, 0.06770256906747818, 0.16095077991485596, 0.007481612730771303, 0.004240153357386589, 0.006185802631080151, 0.0060904379934072495, 0.0037427342031151056, 0.0043992046266794205, 0.0025104053784161806, 0.0027659162878990173, 0.06447428464889526], [0.4519554674625397, 0.0926629826426506, 0.011256865225732327, 0.007133648730814457, 0.07802079617977142, 0.09134230017662048, 0.027379531413316727, 0.1143922358751297, 0.00988288689404726, 0.006507444195449352, 0.005873312242329121, 0.0032645121682435274, 0.007257004268467426, 0.005953365005552769, 0.002365853637456894, 0.0023222421295940876, 0.08242962509393692], [0.5478538870811462, 0.058316558599472046, 0.0035570503678172827, 0.008939341641962528, 0.053859394043684006, 0.07035426795482635, 0.008053836412727833, 0.14636379480361938, 0.004996392875909805, 0.005150305572897196, 0.003946470562368631, 0.0009508839575573802, 0.005032761953771114, 0.0023058801889419556, 0.002249420154839754, 0.0028274590149521828, 0.07524217665195465], [0.3169063329696655, 0.12778297066688538, 0.005257573910057545, 0.03707710653543472, 0.07445850223302841, 0.1684202402830124, 0.03242195025086403, 0.10388179123401642, 0.004478077869862318, 0.009408521465957165, 0.017710385844111443, 0.004453954752534628, 0.008664041757583618, 0.002642967039719224, 0.007299333345144987, 0.004143392201513052, 0.07499285787343979], [0.5461118817329407, 0.03994627296924591, 0.007004902698099613, 0.029389038681983948, 0.029334770515561104, 0.03242163360118866, 0.03923169523477554, 0.12223955243825912, 0.024496493861079216, 0.00499787786975503, 0.009530574083328247, 0.006606025621294975, 0.017486440017819405, 0.013894093222916126, 0.004353648982942104, 0.0053316266275942326, 0.06762354075908661], [0.7106369733810425, 0.029361261054873466, 0.004129546694457531, 0.004591869655996561, 0.008564203977584839, 0.03219747170805931, 0.00542434211820364, 0.1246507465839386, 
0.0022715346422046423, 0.0014207608764991164, 0.001892385887913406, 0.002299071289598942, 0.0009582726052030921, 0.0017483675619587302, 0.0007167321746237576, 0.002529596211388707, 0.06660689413547516], [0.5863592624664307, 0.06978526711463928, 0.0031259639654308558, 0.017568491399288177, 0.012831158004701138, 0.04484372213482857, 0.008003173395991325, 0.07923827320337296, 0.013727568089962006, 0.004443072713911533, 0.01606028899550438, 0.002947880420833826, 0.026261065155267715, 0.007917261682450771, 0.01490645669400692, 0.004926762543618679, 0.08705437928438187], [0.6563798785209656, 0.022435182705521584, 0.003965011332184076, 0.007658019196242094, 0.01815665140748024, 0.027777278795838356, 0.005089675076305866, 0.10607391595840454, 0.01491209864616394, 0.005640522576868534, 0.011453702114522457, 0.004246762953698635, 0.0036070796195417643, 0.008176050148904324, 0.004317238926887512, 0.012927439995110035, 0.08718345314264297], [0.9022427201271057, 0.013484259136021137, 0.005736830178648233, 0.0019634913187474012, 0.005895042791962624, 0.008695744909346104, 0.005197812337428331, 0.01322328019887209, 0.00653631379827857, 0.0009133713319897652, 0.0010642585111781955, 0.0073537142015993595, 0.005280360579490662, 0.004163064993917942, 0.0023721761535853148, 0.008230739273130894, 0.007646849844604731], [0.7062962055206299, 0.01543585304170847, 0.00038290562224574387, 0.0008453846094198525, 0.01872618868947029, 0.01083575189113617, 0.005968781653791666, 0.09175468236207962, 0.006616016384214163, 0.0075217681005597115, 0.007811934221535921, 0.0007507918635383248, 0.020119305700063705, 0.003274541115388274, 0.0014504962600767612, 0.006686098407953978, 0.09552323818206787], [0.5687817335128784, 0.007500171661376953, 0.0012481000740081072, 0.0038021039217710495, 0.01597571186721325, 0.008688051253557205, 0.001626061974093318, 0.14397060871124268, 0.009618405252695084, 0.010725083760917187, 0.008235657587647438, 0.004670614842325449, 0.022284414619207382, 
0.005498046986758709, 0.00850935559719801, 0.015744689851999283, 0.16312110424041748], [0.5851798057556152, 0.04845580831170082, 0.0024077247362583876, 0.01979917101562023, 0.011966938152909279, 0.028333179652690887, 0.00647295918315649, 0.08622950315475464, 0.011261907406151295, 0.004242253489792347, 0.021014271304011345, 0.004997418727725744, 0.04563039168715477, 0.009517955593764782, 0.015191788785159588, 0.0059085870161652565, 0.09339034557342529], [0.9669435024261475, 0.0022029958199709654, 0.00035214831586927176, 0.00016543437959626317, 0.0009811866329982877, 0.0016666107112541795, 0.00039290532004088163, 0.008445731364190578, 0.0010751164518296719, 0.00044181954581290483, 0.00043774713412858546, 0.0006587635725736618, 0.002310391515493393, 0.0006159336189739406, 0.00046792119974270463, 0.002146282000467181, 0.010695446282625198], [0.663899838924408, 0.017244383692741394, 0.00319538451731205, 0.015414198860526085, 0.018682176247239113, 0.011811940930783749, 0.007079834584146738, 0.09032615274190903, 0.006297524552792311, 0.008428191766142845, 0.007065104320645332, 0.007958292961120605, 0.016529854387044907, 0.003802537452429533, 0.005587233696132898, 0.013406040146946907, 0.10327138006687164], [0.6939183473587036, 0.014839899726212025, 0.001858274918049574, 0.0038684557657688856, 0.004067894537001848, 0.015002700500190258, 0.003196831326931715, 0.11537855863571167, 0.0061392392963171005, 0.0035417380277067423, 0.004894040059298277, 0.007757817395031452, 0.0036283310037106276, 0.005451475735753775, 0.0016782361781224608, 0.006084951106458902, 0.10869317501783371]], [[0.6333684325218201, 0.024028640240430832, 0.01582026295363903, 0.02619941160082817, 0.04270121082663536, 0.009178788401186466, 0.0038392902351915836, 0.21672624349594116, 0.0017483258852735162, 0.001017351751215756, 0.0010785945923998952, 0.0001320177543675527, 0.0011158701963722706, 0.0013306918554008007, 0.000298039783956483, 0.0008072047494351864, 0.020609553903341293], [1.938435343618039e-05, 
6.173909059725702e-05, 0.9999021291732788, 6.603690962947439e-06, 3.5700384870551716e-08, 9.066732076234985e-08, 3.589256891700643e-07, 4.228381840221118e-06, 6.511343286774718e-08, 1.2281238526146154e-11, 3.181120078465938e-08, 5.9575171462711296e-08, 3.022314487566291e-08, 8.439465659648704e-07, 1.3673281951120941e-10, 1.6037619232633915e-08, 4.2792162275873125e-06], [2.3578810214530677e-05, 1.2059001619491028e-07, 6.968660272832494e-06, 0.9999454021453857, 2.1070180082460865e-05, 3.048452157372594e-08, 4.142956200325898e-09, 5.234975333223701e-07, 1.9411934317759005e-06, 2.668707033137707e-09, 8.73100826663184e-14, 6.171158584145076e-10, 3.570415962883544e-09, 1.8966605352943589e-07, 1.4805887360580527e-07, 4.436480349756522e-11, 1.0428436070242242e-07], [8.065016299951822e-05, 8.256847650045529e-07, 2.5425518401789304e-07, 5.274629802443087e-05, 0.9998537302017212, 7.893589099694509e-06, 2.3578591523687464e-08, 3.43811643688241e-06, 4.5912926238678153e-10, 6.059946144887363e-08, 4.253125074349384e-10, 1.0152786272289716e-13, 1.6957475423851065e-09, 9.410592305414411e-10, 5.470099040394416e-09, 3.573008200419281e-07, 1.0538805383930594e-08], [7.252969953697175e-05, 1.0088042472489178e-06, 7.699180173403875e-07, 9.171030512789002e-08, 2.4529730580979958e-05, 0.9998767375946045, 1.5409394109155983e-05, 5.737579613196431e-06, 2.1189318744063712e-08, 1.1404707178641615e-09, 1.92392076314718e-06, 5.324307661425109e-10, 1.7075421209367808e-13, 2.0237015188606655e-11, 1.554867623543288e-10, 2.7191543239268867e-08, 1.2435561984602828e-06], [0.0001449573173886165, 1.961492017699129e-07, 6.18173388033938e-08, 4.283891996692546e-07, 2.922433282037673e-07, 3.328191814944148e-05, 0.9997778534889221, 4.209042526781559e-05, 3.019092886802355e-08, 1.7570700094893255e-07, 3.8337217844741645e-09, 8.989945143866862e-08, 1.9401731332635563e-09, 2.6702029259570437e-14, 5.938007863193207e-10, 1.386070841435938e-10, 5.102368163534265e-07], [0.007875271141529083, 2.78672530384938e-07, 
3.2709865536162397e-06, 1.2520325221032635e-08, 3.734522024956277e-08, 3.981714513656698e-08, 1.071748783942894e-06, 0.9919500350952148, 4.3283617401357333e-07, 4.404072245778323e-11, 5.2820126938968315e-08, 8.038200105531246e-10, 6.446478550969914e-07, 1.78486253554766e-10, 9.004659693239187e-17, 4.770594475012047e-10, 0.00016886369849089533], [0.9675384759902954, 2.4402952547575296e-08, 2.4659111659275368e-05, 0.0005393843166530132, 9.950016828952357e-05, 9.271211638406385e-06, 3.642330170805508e-07, 0.0050777713768184185, 0.026093894615769386, 0.0005052406922914088, 4.626071677193977e-05, 3.5581527413341973e-07, 1.1286647350061685e-06, 7.7222457548487e-06, 4.0241860688183806e-07, 1.0882030210268567e-06, 5.439632514026016e-05], [3.0027919706299144e-07, 3.0919871697732138e-12, 2.84315648281519e-13, 2.217392935932594e-09, 6.027913279638142e-09, 1.4596358843821378e-11, 4.318633450850484e-09, 3.70443609121196e-09, 7.221189662232064e-06, 0.9999920129776001, 5.019703053221747e-07, 8.708393728351638e-11, 5.175761597087103e-09, 2.2277747138352288e-13, 1.1703649605010469e-08, 7.325244577582879e-12, 5.253080522654718e-12], [5.72791691411112e-07, 9.049626326085303e-11, 7.4393498306069e-11, 9.07126207917025e-15, 5.032383953995634e-10, 1.0314469278682736e-08, 1.1090874746377821e-11, 2.6279504794501918e-08, 4.8859178924942626e-09, 3.091863050030952e-07, 0.999998927116394, 1.2974977536828192e-08, 1.603888133416831e-10, 3.0041871768027306e-11, 3.181084410643076e-14, 6.817750630716546e-08, 8.795264072603004e-09], [1.1951796352605015e-07, 1.9400122133750308e-11, 3.7851676654154787e-11, 2.199900689132256e-13, 2.863965354883138e-15, 1.702163687777869e-10, 3.0246940507794307e-09, 1.4474657028529236e-09, 3.899944367447006e-09, 1.523147052928664e-09, 5.010775794289657e-07, 0.9999991655349731, 5.912943468189269e-09, 6.186419432285817e-11, 5.683380502330415e-11, 1.9405043544251654e-11, 2.479856391346402e-07], [2.4267053959192708e-05, 1.472443926786582e-07, 7.291051673519178e-08, 
3.203686205210943e-08, 5.360495380912766e-10, 2.116960713715102e-13, 8.399983819629142e-09, 1.3914322153141256e-05, 6.976647259904212e-09, 1.6974820482573705e-07, 3.115429336730813e-08, 4.852436177316122e-07, 0.999946117401123, 1.4210274457582273e-05, 1.5129073105413227e-08, 2.0338172035394564e-08, 5.701721761397494e-07], [1.1685061451771617e-07, 7.447960650996954e-12, 3.1582297310706053e-07, 6.020477172352656e-11, 6.321258794184104e-11, 5.942484785498303e-12, 1.2458434815132923e-15, 2.5889788091149057e-08, 1.6114299228320306e-07, 2.351414722656653e-11, 2.9442976057225678e-08, 2.647815300349521e-10, 8.733418326301035e-07, 0.9999984502792358, 2.9151566494078907e-08, 5.1976982717860665e-09, 1.964271412191465e-08], [7.690763936807343e-07, 5.7923849744456746e-11, 2.123421750932497e-11, 1.0758450130765596e-08, 1.7259421669635344e-10, 3.496171234462775e-12, 4.377409459216386e-12, 1.452169027388317e-11, 1.063384536675871e-11, 3.252171154599637e-08, 1.560671383793455e-11, 6.66970367824149e-11, 1.3440947022047567e-08, 2.2631687897956e-05, 0.9999750852584839, 1.4043592955204076e-06, 2.9219464181551302e-08], [2.985507080666139e-06, 1.4227438214220456e-06, 1.7013533637477707e-10, 1.1511919889573008e-12, 1.6513105549620377e-08, 4.302720332804988e-11, 8.268533774336007e-11, 4.258034369541974e-09, 2.1723913555235145e-16, 3.6362386435229155e-09, 5.106020125822397e-06, 8.561303575793655e-12, 9.83621273320523e-09, 1.3094116901868347e-09, 4.0106687038132804e-07, 0.999988317489624, 1.691036345619068e-06], [0.0030385619029402733, 1.633214878893341e-06, 1.3128044429322472e-06, 5.003703407169269e-08, 5.8457025886582414e-09, 9.394044013788516e-07, 1.0661939597866876e-07, 1.0856862900254782e-05, 1.429447671341677e-09, 6.110028455408312e-11, 5.1160109251213726e-06, 1.8191908566222992e-06, 9.099260012135346e-08, 3.81958079742617e-06, 7.576409757348301e-07, 0.00038831771234981716, 0.9965465664863586], [0.9982079267501831, 8.643622209092428e-07, 3.3460573831689544e-06, 0.0002227737131761387, 
2.980569661303889e-06, 1.2109315328245884e-08, 4.106748008325667e-07, 0.0002561875735409558, 4.2954678036721816e-08, 1.069768131856108e-06, 1.8953638658558702e-08, 9.561464509033613e-08, 0.00022737712424714118, 3.61794889158773e-07, 7.938553494568623e-07, 1.636926390347071e-05, 0.0010594949126243591]], [[0.6119003295898438, 0.021581873297691345, 0.021193664520978928, 0.012721186503767967, 0.015900934115052223, 0.01727457530796528, 0.009155241772532463, 0.11030912399291992, 0.016146371141076088, 0.009341476485133171, 0.009282168932259083, 0.0073842392303049564, 0.014071679674088955, 0.01539422944188118, 0.00883619673550129, 0.011063641868531704, 0.0884430930018425], [0.35931894183158875, 0.03959937393665314, 0.06138524413108826, 0.022490933537483215, 0.030552275478839874, 0.034835271537303925, 0.019451741129159927, 0.17165839672088623, 0.013603885658085346, 0.012843778356909752, 0.015116498805582523, 0.013518509455025196, 0.011545107699930668, 0.01215664017945528, 0.00508335093036294, 0.014057524502277374, 0.16278253495693207], [0.09382070600986481, 0.09909869730472565, 0.05581952631473541, 0.0935056135058403, 0.04949573799967766, 0.07117585092782974, 0.12767624855041504, 0.061477504670619965, 0.03662518784403801, 0.15335127711296082, 0.013764445669949055, 0.0025596146006137133, 0.004863533657044172, 0.029972365126013756, 0.03622915595769882, 0.023739580065011978, 0.04682505875825882], [0.3372790515422821, 0.07769323140382767, 0.01718800514936447, 0.22605858743190765, 0.021703997626900673, 0.06861609220504761, 0.11675143241882324, 0.03624695539474487, 0.007911092601716518, 0.00880168005824089, 0.007015812676399946, 0.009533777832984924, 0.0010994685580953956, 0.0075613390654325485, 0.015131873078644276, 0.011829919181764126, 0.029577815905213356], [0.39484772086143494, 0.041872866451740265, 0.08691174536943436, 0.11149054020643234, 0.007923164404928684, 0.03480171784758568, 0.03658316656947136, 0.1314055174589157, 0.00800010934472084, 0.011399831622838974, 
[Notebook output omitted: nested arrays of BERT attention weights — 17×17 token-to-token probability matrices (each row summing to ≈1), one matrix per attention head and layer — consumed by the bertviz head-view visualization. The raw numeric values carry no readable content and are elided here.]
0.024816282093524933, 0.06547172367572784, 0.05209805443882942, 0.048544544726610184, 0.014689113013446331, 0.024222608655691147, 0.021391814574599266, 0.10453769564628601, 0.0864948183298111], [0.00525372801348567, 0.008562478236854076, 0.06322943419218063, 0.051881082355976105, 0.016447104513645172, 0.0070586856454610825, 0.02810019813477993, 0.014835519716143608, 0.16312631964683533, 0.10837546736001968, 0.02614545077085495, 0.03397918865084648, 0.12994147837162018, 0.22453990578651428, 0.0840318351984024, 0.021313536912202835, 0.013178596273064613], [0.5652258992195129, 0.004329378716647625, 0.0014928752789273858, 0.00023409936693497002, 0.0017497795633971691, 0.006608752068132162, 0.014801431447267532, 0.22674818336963654, 0.00014837308845017105, 0.00043031087261624634, 0.001710680895484984, 0.0005727025563828647, 0.00011728404933819547, 0.0002464406134095043, 0.00020737582235597074, 0.0010602230904623866, 0.17431631684303284]], [[0.6414362788200378, 0.03097419999539852, 0.015060914680361748, 0.013869376853108406, 0.030054345726966858, 0.04289112985134125, 0.12132196128368378, 0.02386116422712803, 0.0039678215980529785, 0.001956162741407752, 0.01697446033358574, 0.0028806745540350676, 0.0037777922116219997, 0.0039759124629199505, 0.004358331672847271, 0.02160787209868431, 0.02103157341480255], [0.7796869277954102, 0.08965377509593964, 0.0006269690929912031, 0.00013449213292915374, 0.003985970746725798, 0.0008733444265089929, 0.0030556542333215475, 0.05371616408228874, 7.197825652838219e-06, 0.0002654893323779106, 8.548847836209461e-05, 0.0009809831390157342, 0.0005150526412762702, 6.053207835066132e-05, 0.00011665627243928611, 0.002710674423724413, 0.06352473795413971], [0.13382840156555176, 0.8474864959716797, 0.0007663805736228824, 0.0004329253570176661, 0.0017503236886113882, 0.0005166102782823145, 0.0010875544976443052, 0.005970390979200602, 1.975052509806119e-05, 6.413087248802185e-05, 2.6730540412245318e-05, 0.00027211170527152717, 0.0021560448221862316, 
5.8920166338793933e-05, 1.864887963165529e-05, 0.00032318488229066133, 0.0052213892340660095], [0.6396790146827698, 0.11604563146829605, 0.11961358040571213, 0.025691872462630272, 0.0039607021026313305, 0.014793678186833858, 0.0006790012703277171, 0.04284597560763359, 0.00012398074613884091, 1.4971393284213264e-05, 3.507592191454023e-05, 9.602763748262078e-05, 0.0009469330543652177, 0.0004751326923724264, 0.000462841970147565, 0.0008707231609150767, 0.03366478160023689], [0.014892622828483582, 0.00028357599512673914, 0.004274646285921335, 0.9667023420333862, 0.001094589475542307, 0.0026263876352459192, 3.664407631731592e-05, 0.005360112059861422, 4.8568246711511165e-05, 1.8892007574322633e-05, 1.7737947928253561e-06, 1.1354706657584757e-05, 1.4460270904237404e-05, 1.773163239704445e-05, 0.0010288808261975646, 2.011245123867411e-05, 0.003567290958017111], [0.13455365598201752, 0.0005738924373872578, 0.0005524298758246005, 0.005741039756685495, 0.812554657459259, 0.028746595606207848, 0.003608370665460825, 0.006458199582993984, 5.1745795644819736e-05, 0.0009732747566886246, 1.591056934557855e-05, 4.918182639812585e-06, 7.799909326422494e-06, 2.671424408617895e-05, 5.763000808656216e-05, 0.0013665275182574987, 0.004706633742898703], [0.13473674654960632, 0.0005315656308084726, 0.001562734367325902, 0.00354046025313437, 0.01571027934551239, 0.8205724358558655, 0.0024839614052325487, 0.008684917353093624, 0.0011435078922659159, 0.0015346268191933632, 0.0009490312659181654, 0.00016722585132811219, 4.825779797101859e-06, 3.3896903914865106e-05, 1.1827576599898748e-05, 0.00038634252268821, 0.007945622317492962], [0.8451223373413086, 0.018900206312537193, 0.006750135216861963, 0.004620402120053768, 0.005972793325781822, 0.017863905057311058, 0.03220708668231964, 0.0149227948859334, 0.005645844154059887, 0.0051730358973145485, 0.002922446234151721, 0.006880566012114286, 0.0029047727584838867, 0.003861810779199004, 0.004655292723327875, 0.007023051381111145, 
0.014573591761291027], [0.480422705411911, 0.0035214603412896395, 0.0005336942849680781, 0.0002365068212384358, 0.0006784957367926836, 0.00166214513592422, 0.00926278904080391, 0.27294233441352844, 0.02755379118025303, 0.0330209843814373, 0.00012725594569928944, 0.003297538263723254, 0.0031243714038282633, 2.063170904875733e-05, 1.3691305866814218e-06, 0.0002789180143736303, 0.16331496834754944], [0.01811359077692032, 1.3607079381472431e-05, 9.948589649866335e-07, 1.7589618437341414e-05, 2.5885583454510197e-05, 0.00028699261019937694, 4.719393837149255e-05, 0.026368696242570877, 0.8781535029411316, 0.05810905992984772, 0.00015312856703530997, 0.0005857597570866346, 0.00010577541252132505, 8.14062004792504e-05, 9.541477083985228e-07, 2.096671323670307e-06, 0.01793370582163334], [0.05285900458693504, 0.00030160401365719736, 6.556468179041985e-06, 6.674522592220455e-05, 0.00024568778462707996, 0.0005034739151597023, 0.0002049742906820029, 0.002742514945566654, 0.008634930476546288, 0.9288884997367859, 9.068482177099213e-05, 0.0008016722276806831, 0.0003258608339820057, 0.0011096963426098228, 0.0006773598142899573, 7.933868619147688e-05, 0.0024613363202661276], [0.00221243710257113, 2.2538906705449335e-05, 1.9147739749314496e-07, 1.647128442527901e-07, 1.0417431894893525e-06, 2.8651111279032193e-05, 1.2499674085120205e-05, 0.0003085247299168259, 3.5355267300474225e-06, 0.0003461781016085297, 0.9964503049850464, 0.00022603623801842332, 2.6259762307745405e-05, 2.3586344468640164e-06, 3.885842943418538e-06, 4.942189480061643e-05, 0.000305999128613621], [0.01036781631410122, 0.0008342416840605438, 1.7128641047747806e-05, 2.0533991119009443e-05, 2.9555935725511517e-06, 3.6144359910394996e-05, 7.378499140031636e-05, 0.006765018682926893, 0.00021779925737064332, 0.0015782729024067521, 0.0007277305121533573, 0.9368734955787659, 0.03224625438451767, 0.0010896268067881465, 0.0009329294553026557, 0.00032172640203498304, 0.007894447073340416], [0.03563461825251579, 
0.002855573082342744, 0.00013671958004124463, 3.781789564527571e-05, 1.0104925422638189e-05, 2.1286070932546863e-06, 3.7324938602978364e-05, 0.008594353683292866, 2.7424977815826423e-05, 0.0009628230473026633, 3.615149762481451e-05, 0.024303004145622253, 0.899254560470581, 0.017478864639997482, 0.000342755374731496, 0.003295246744528413, 0.006990519352257252], [0.23552358150482178, 0.0007450535777024925, 1.1584193998714909e-05, 3.410106728551909e-05, 1.4207294043444563e-05, 1.7315085642621852e-05, 1.5014760492704227e-06, 0.002948538865894079, 0.0024790773168206215, 0.0010183261474594474, 7.436121813952923e-05, 0.0002751484571490437, 0.003731070552021265, 0.7477232813835144, 0.001778375473804772, 0.0004652889329008758, 0.003159280400723219], [0.0037040780298411846, 5.1233841077191755e-05, 3.5408945677772863e-06, 9.383707219967619e-05, 1.5205941963358782e-05, 3.0963343306211755e-05, 3.3952089779631933e-06, 0.00026308675296604633, 2.7787433509729453e-07, 4.019607513328083e-05, 4.345218712842325e-06, 0.00011893665214302018, 8.212411194108427e-05, 0.0002901941479649395, 0.9945210218429565, 0.0004440408665686846, 0.00033355338382534683], [0.8236054182052612, 0.020156359300017357, 0.007659207563847303, 0.005220772698521614, 0.007615920156240463, 0.01960946060717106, 0.02552926167845726, 0.013913772068917751, 0.00552974920719862, 0.006079630460590124, 0.006619543768465519, 0.007675293833017349, 0.004190606065094471, 0.00814027152955532, 0.009781114757061005, 0.014565378427505493, 0.014108278788626194]], [[0.09331492334604263, 0.056674692779779434, 0.21008054912090302, 0.14979636669158936, 0.027833765372633934, 0.03818805515766144, 0.0658632218837738, 0.17810924351215363, 0.004991346970200539, 0.017665700986981392, 0.006204724311828613, 0.0036213051062077284, 0.00416893046349287, 0.005470216274261475, 0.005164945498108864, 0.0071010710671544075, 0.1257508099079132], [0.2844104468822479, 0.10228923708200455, 0.012822174467146397, 0.0030406410805881023, 0.008765681646764278, 
0.05496018007397652, 0.012015220709145069, 0.26282384991645813, 0.0015031542861834168, 0.0023093027994036674, 0.004559095948934555, 0.0027201236225664616, 0.0005719957989640534, 0.001391653437167406, 0.0021230566781014204, 0.004181517753750086, 0.2395126223564148], [0.41203033924102783, 0.04571434110403061, 0.022425886243581772, 0.0031624797265976667, 0.010137534700334072, 0.023898957297205925, 0.00600487319752574, 0.24329237639904022, 0.0029769798275083303, 0.015779104083776474, 0.0025153697934001684, 0.0032908369321376085, 0.001296468311920762, 0.00203381828032434, 0.001692996476776898, 0.0025826231576502323, 0.2011651247739792], [0.29724523425102234, 0.07139614224433899, 0.024225814267992973, 0.06562759727239609, 0.030032817274332047, 0.06358474493026733, 0.041361428797245026, 0.19900482892990112, 0.0032035429030656815, 0.011618641205132008, 0.004483609925955534, 0.009382389485836029, 0.0017680958844721317, 0.0027840775437653065, 0.009580304846167564, 0.0036194443237036467, 0.16108128428459167], [0.36703553795814514, 0.06263343244791031, 0.0022157705388963223, 0.0012496617855504155, 0.015489872545003891, 0.04504157230257988, 0.00374838849529624, 0.24440152943134308, 0.002558021107688546, 0.0014142229920253158, 0.0010269286576658487, 0.002482091775164008, 0.0008897537481971085, 0.0017272342229261994, 0.00022994652681518346, 0.0007226300076581538, 0.24713343381881714], [0.3090778887271881, 0.06674632430076599, 0.006006896961480379, 0.0024640285409986973, 0.010481102392077446, 0.04672401025891304, 0.0099563617259264, 0.2720598876476288, 0.0014253626577556133, 0.002726626116782427, 0.0026314258575439453, 0.0036743918899446726, 0.0010710149072110653, 0.0013408155646175146, 0.0010161481332033873, 0.0024455806706100702, 0.2601521611213684], [0.2985815107822418, 0.033986128866672516, 0.021185586228966713, 0.06316479295492172, 0.025739949196577072, 0.017223123461008072, 0.0019348671194165945, 0.24252822995185852, 0.0057408916763961315, 0.01144476979970932, 
0.006610502488911152, 0.004603979177772999, 0.0031137766782194376, 0.0064673093147575855, 0.004012224264442921, 0.012082505039870739, 0.2415798157453537], [0.17522114515304565, 0.009799904190003872, 0.005302064120769501, 0.008822753094136715, 0.012573093175888062, 0.010886980220675468, 0.023083878681063652, 0.3665170967578888, 0.0050969296135008335, 0.006993346847593784, 0.005743982270359993, 0.008875705301761627, 0.014813671819865704, 0.0055067394860088825, 0.0047635892406105995, 0.019929952919483185, 0.3160691261291504], [0.34310823678970337, 0.06502809375524521, 0.0021205185912549496, 0.00020462179963942617, 0.0010511638829484582, 0.03155646473169327, 0.0007323729223571718, 0.24103838205337524, 0.007478512357920408, 0.0021872480865567923, 0.0028576774057000875, 0.009931106120347977, 0.005292694084346294, 0.004525842145085335, 0.0065164994448423386, 0.004492613486945629, 0.2718779444694519], [0.33643779158592224, 0.024609388783574104, 0.0016998628852888942, 0.0008097393438220024, 0.0038962308317422867, 0.015501647256314754, 0.000424665748141706, 0.2828017771244049, 0.00767585588619113, 0.001466101035475731, 0.003040989860892296, 0.0051518953405320644, 0.0038221084978431463, 0.0044102780520915985, 0.008668821305036545, 0.0032396167516708374, 0.29634320735931396], [0.3713393807411194, 0.011105533689260483, 0.003743572859093547, 0.008938511833548546, 0.002995800692588091, 0.008838982321321964, 0.001157468417659402, 0.2821926176548004, 0.004074815660715103, 0.005535315256565809, 0.0030692932195961475, 0.007599976379424334, 0.0012818478280678391, 0.003024027682840824, 0.0011526128510013223, 0.0058023687452077866, 0.27814778685569763], [0.36102738976478577, 0.05066632106900215, 0.0011718260357156396, 0.005367958918213844, 0.0035248207859694958, 0.0430634431540966, 0.0011840922525152564, 0.2113596349954605, 0.01272980310022831, 0.013502378948032856, 0.0031616787891834974, 0.00816856138408184, 0.012989488430321217, 0.012959785759449005, 0.004678542260080576, 
0.008028513751924038, 0.24641574919223785], [0.31085413694381714, 0.03351454436779022, 0.00103286886587739, 0.0004577648942358792, 0.001868027145974338, 0.02804107405245304, 0.000937800679821521, 0.22322045266628265, 0.024304693564772606, 0.008120846934616566, 0.018699616193771362, 0.016921715810894966, 0.002167311031371355, 0.017596295103430748, 0.02243221551179886, 0.003368664300069213, 0.2864619195461273], [0.36689120531082153, 0.0737122893333435, 0.0015678750351071358, 0.00041538808727636933, 0.0010219237301498652, 0.036112092435359955, 0.0006845110910944641, 0.22632870078086853, 0.007153042126446962, 0.0015710457228124142, 0.0023514952044934034, 0.011136310175061226, 0.005858455318957567, 0.004047358874231577, 0.00858561135828495, 0.004430875647813082, 0.24813182651996613], [0.39216870069503784, 0.017801424488425255, 0.0007150857127271593, 0.0015222164802253246, 0.0008075842051766813, 0.015213064849376678, 0.001823327736929059, 0.27745744585990906, 0.0021716472692787647, 0.004416048526763916, 0.004889229312539101, 0.001699875108897686, 0.00041639237315393984, 0.001037251902744174, 0.00032518699299544096, 0.0012386300368234515, 0.2762969434261322], [0.3371252417564392, 0.01693182811141014, 0.0005964129813946784, 0.0010466380044817924, 0.0005187720526009798, 0.01702599786221981, 0.00028795877005904913, 0.2638838291168213, 0.015598482452332973, 0.001817028969526291, 0.0019759235437959433, 0.004123141523450613, 0.004623569082468748, 0.011006705462932587, 0.0008867817814461887, 0.0015239134663715959, 0.3210277855396271], [0.1683782935142517, 0.010808063670992851, 0.006809295155107975, 0.011103985831141472, 0.015447120182216167, 0.012427483685314655, 0.030049419030547142, 0.340798556804657, 0.007226421497762203, 0.009385605342686176, 0.00882553867995739, 0.012476149946451187, 0.022951599210500717, 0.007809228263795376, 0.006592255551367998, 0.030987152829766273, 0.29792383313179016]], [[0.3024081885814667, 0.030279221013188362, 0.021122870966792107, 
0.07453761994838715, 0.016274038702249527, 0.02230302430689335, 0.00780830392614007, 0.13088785111904144, 0.02475917525589466, 0.05267786234617233, 0.019592205062508583, 0.017613627016544342, 0.022161509841680527, 0.027539823204278946, 0.02985086478292942, 0.06742105633020401, 0.13276273012161255], [0.12149839103221893, 0.03515300899744034, 0.027160238474607468, 0.036827292293310165, 0.038319479674100876, 0.032634101808071136, 0.04625249281525612, 0.0819600373506546, 0.06215345114469528, 0.09483538568019867, 0.03626265376806259, 0.025327347218990326, 0.12740276753902435, 0.07758533954620361, 0.027978135272860527, 0.05486680567264557, 0.07378306984901428], [0.3243929445743561, 0.011772572994232178, 0.0036798587534576654, 0.007018841803073883, 0.010714802891016006, 0.009844991378486156, 0.013599199242889881, 0.20018962025642395, 0.010531843639910221, 0.07193948328495026, 0.040880560874938965, 0.01727849803864956, 0.05413239821791649, 0.008542153984308243, 0.01814631186425686, 0.012107236310839653, 0.18522872030735016], [0.3385339677333832, 0.04158630594611168, 0.036076080054044724, 0.01327919028699398, 0.024219930171966553, 0.047071948647499084, 0.04189755767583847, 0.16749706864356995, 0.026543328538537025, 0.015559019520878792, 0.00597268994897604, 0.005869063548743725, 0.04054190590977669, 0.028945906087756157, 0.004271315410733223, 0.01570635475218296, 0.14642831683158875], [0.16865506768226624, 0.04938710480928421, 0.019549604505300522, 0.05930985137820244, 0.07521853595972061, 0.04664945602416992, 0.024897927418351173, 0.14647944271564484, 0.03934016451239586, 0.06177183613181114, 0.013278636150062084, 0.02136322110891342, 0.03814351186156273, 0.038135841488838196, 0.02510661818087101, 0.029801618307828903, 0.14291152358055115], [0.18725530803203583, 0.05376933887600899, 0.029055269435048103, 0.049089428037405014, 0.03742990270256996, 0.05701279267668724, 0.03076740726828575, 0.10879355669021606, 0.06185257062315941, 0.07229889929294586, 0.0236397385597229, 
0.02484162338078022, 0.06708837300539017, 0.04484312981367111, 0.02780052274465561, 0.02588454633951187, 0.09857764095067978], [0.2257799208164215, 0.0345316082239151, 0.006942221894860268, 0.01437275018543005, 0.039183422923088074, 0.025698179379105568, 0.008817379362881184, 0.1455419957637787, 0.0696139857172966, 0.021771183237433434, 0.025386322289705276, 0.006431358866393566, 0.09228303283452988, 0.08459044247865677, 0.016931749880313873, 0.050242576748132706, 0.13188187777996063], [0.29042553901672363, 0.007339330855756998, 0.0014254071284085512, 0.0033209100365638733, 0.00431196391582489, 0.008623596280813217, 0.0015750013990327716, 0.32417330145835876, 0.00427272217348218, 0.002996231894940138, 0.005141968838870525, 0.006836503744125366, 0.005467037204653025, 0.005742002744227648, 0.0024779916275292635, 0.0054355221800506115, 0.32043489813804626], [0.25044775009155273, 0.036818623542785645, 0.09313350915908813, 0.00952901877462864, 0.012264832854270935, 0.024558238685131073, 0.053025126457214355, 0.16229106485843658, 0.024035083130002022, 0.04518783837556839, 0.018864067271351814, 0.017499709501862526, 0.03569392114877701, 0.014961350709199905, 0.024027641862630844, 0.03000655397772789, 0.14765571057796478], [0.2250499278306961, 0.04895733296871185, 0.013785665854811668, 0.005020629148930311, 0.00792643241584301, 0.03186631575226784, 0.013938482850790024, 0.271396279335022, 0.02267514169216156, 0.007961919531226158, 0.0034771845676004887, 0.030567822977900505, 0.02528628334403038, 0.012654193677008152, 0.006924149580299854, 0.010077578015625477, 0.262434720993042], [0.3020283579826355, 0.06289812922477722, 0.017301304265856743, 0.011179446242749691, 0.006016455590724945, 0.017042119055986404, 0.03613385185599327, 0.2194403111934662, 0.01605672389268875, 0.02131962962448597, 0.007852209731936455, 0.010359067469835281, 0.012205595150589943, 0.008597283624112606, 0.025482745841145515, 0.01104278676211834, 0.21504399180412292], [0.23796281218528748, 
0.0670369565486908, 0.06441561132669449, 0.008521844632923603, 0.01696154475212097, 0.03187268599867821, 0.04601573571562767, 0.1900751292705536, 0.014263161458075047, 0.02239990048110485, 0.006141927558928728, 0.012937868945300579, 0.014200568199157715, 0.008146810345351696, 0.008377047255635262, 0.06010851636528969, 0.19056174159049988], [0.33841192722320557, 0.05038106441497803, 0.02013375610113144, 0.008323549292981625, 0.007523011416196823, 0.011559674516320229, 0.013525321148335934, 0.22538666427135468, 0.01617872714996338, 0.01860102266073227, 0.0009935613488778472, 0.02454315684735775, 0.008262021467089653, 0.019163036718964577, 0.004872401710599661, 0.013227381743490696, 0.2189137190580368], [0.2676387131214142, 0.04345650225877762, 0.05921970680356026, 0.008791954256594181, 0.007398023270070553, 0.016023466363549232, 0.026174215599894524, 0.191934734582901, 0.018210653215646744, 0.02745293453335762, 0.015342566184699535, 0.021244583651423454, 0.04558000713586807, 0.022756297141313553, 0.019447900354862213, 0.0283270925283432, 0.1810005158185959], [0.23023921251296997, 0.03771331533789635, 0.00433285953477025, 0.0019842812325805426, 0.0068510472774505615, 0.012707910500466824, 0.007208711933344603, 0.27782872319221497, 0.01437422912567854, 0.009604205377399921, 0.024872267618775368, 0.01447662990540266, 0.01158254686743021, 0.011691499501466751, 0.03275063633918762, 0.009953297674655914, 0.29182863235473633], [0.287616103887558, 0.048705801367759705, 0.02851501666009426, 0.0043346332386136055, 0.010277453809976578, 0.018220730125904083, 0.017625797539949417, 0.19514334201812744, 0.019543014466762543, 0.016393378376960754, 0.010116017423570156, 0.03422543406486511, 0.03615939989686012, 0.01829969324171543, 0.02935090847313404, 0.033645447343587875, 0.19182783365249634], [0.273516446352005, 0.010180409997701645, 0.0021359883248806, 0.003978705033659935, 0.006034583318978548, 0.012492965906858444, 0.002306202193722129, 0.32091525197029114, 
0.0057565392926335335, 0.0031509348191320896, 0.007490784861147404, 0.009531576186418533, 0.00661844527348876, 0.00788586214184761, 0.003700338304042816, 0.007180912885814905, 0.31712406873703003]], [[0.35998964309692383, 0.00720044644549489, 0.00952097773551941, 0.02097858302295208, 0.004048050846904516, 0.006635804660618305, 0.009357865899801254, 0.2452428787946701, 0.013117024675011635, 0.02917439490556717, 0.005705017596483231, 0.011636906303465366, 0.025099685415625572, 0.013271000236272812, 0.0032164352014660835, 0.009907032363116741, 0.22589831054210663], [0.2001110166311264, 0.03055390901863575, 0.04606832563877106, 0.18214459717273712, 0.028567230328917503, 0.03422768414020538, 0.02714725024998188, 0.15776167809963226, 0.011524679139256477, 0.031206343322992325, 0.004239538684487343, 0.008807740174233913, 0.06622479856014252, 0.01336232852190733, 0.002346902387216687, 0.00920367892831564, 0.14650234580039978], [0.04230611026287079, 0.03502862900495529, 0.33979862928390503, 0.19633087515830994, 0.02950000949203968, 0.019022446125745773, 0.026152078062295914, 0.13933001458644867, 0.003422748064622283, 0.0113377645611763, 0.0017469078302383423, 0.0019091337453573942, 0.0027396988589316607, 0.0034355332609266043, 0.0014984962763264775, 0.015023568645119667, 0.13141736388206482], [0.12133914977312088, 0.026223640888929367, 0.013829448260366917, 0.26078295707702637, 0.026229368522763252, 0.018690595403313637, 0.08555667847394943, 0.17745764553546906, 0.006205730140209198, 0.040454357862472534, 0.017176903784275055, 0.004207426682114601, 0.005872828420251608, 0.007993195205926895, 0.004594189580529928, 0.009112872183322906, 0.17427298426628113], [0.2000785917043686, 0.019389281049370766, 0.01480416115373373, 0.09598270803689957, 0.007851284928619862, 0.017362261191010475, 0.040210265666246414, 0.2831098139286041, 0.0029720584861934185, 0.023673919960856438, 0.0070809959433972836, 0.002881062915548682, 0.03115232102572918, 0.003954009152948856, 
0.001562010496854782, 0.0040627093985676765, 0.24387258291244507], [0.15942883491516113, 0.042274899780750275, 0.02302929200232029, 0.1309691220521927, 0.03720428794622421, 0.03163367137312889, 0.044747963547706604, 0.22667333483695984, 0.009123533964157104, 0.015994736924767494, 0.0070763020776212215, 0.005939931608736515, 0.041402436792850494, 0.010428872890770435, 0.0033966531045734882, 0.007228354457765818, 0.2034478634595871], [0.18066667020320892, 0.017416061833500862, 0.01610175333917141, 0.11018575727939606, 0.03102722018957138, 0.01920236088335514, 0.020321322605013847, 0.3049446642398834, 0.0038466574624180794, 0.0040113721042871475, 0.004581425338983536, 0.0075630624778568745, 0.009110287763178349, 0.004809110891073942, 0.0016243770951405168, 0.0031855714041739702, 0.26140230894088745], [0.5840312838554382, 0.004743863362818956, 0.014474052004516125, 0.009881310164928436, 0.00264731771312654, 0.004685430787503719, 0.002674694638699293, 0.1871562898159027, 0.002601411659270525, 0.007300317753106356, 0.0015796282095834613, 0.0023993945214897394, 0.0017056650249287486, 0.0018584171775728464, 0.0009000025456771255, 0.005964957643300295, 0.16539600491523743], [0.30828002095222473, 0.006220230832695961, 0.0059934575110673904, 0.00644401041790843, 0.002393897855654359, 0.010106999427080154, 0.005361232906579971, 0.17305806279182434, 0.011036748997867107, 0.14138130843639374, 0.01077658124268055, 0.024243030697107315, 0.10418997704982758, 0.00816765334457159, 0.014450970105826855, 0.009320174343883991, 0.158575639128685], [0.0815197005867958, 0.020000943914055824, 0.16753031313419342, 0.03431420400738716, 0.02243422158062458, 0.029751300811767578, 0.012675809673964977, 0.12738651037216187, 0.04833992198109627, 0.04793912544846535, 0.009313643909990788, 0.029552778229117393, 0.04286762699484825, 0.028911998495459557, 0.12530866265296936, 0.04064905270934105, 0.13150420784950256], [0.5282158255577087, 0.009215841069817543, 0.02208750694990158, 
0.014886071905493736, 0.0026185635942965746, 0.012227934785187244, 0.01554732397198677, 0.16134385764598846, 0.006239890120923519, 0.016384460031986237, 0.003174096578732133, 0.013376103714108467, 0.010847215540707111, 0.004526900127530098, 0.001723004737868905, 0.016203444451093674, 0.16138193011283875], [0.3043380081653595, 0.02286265231668949, 0.03255901858210564, 0.02116522192955017, 0.015146443620324135, 0.027407599613070488, 0.021879781037569046, 0.22911445796489716, 0.007677081506699324, 0.03117549605667591, 0.004477428738027811, 0.01851595565676689, 0.011677582748234272, 0.006602891720831394, 0.015908220782876015, 0.007692528888583183, 0.22179968655109406], [0.1752730756998062, 0.004588421434164047, 0.002817933913320303, 0.003289309097453952, 0.00123742560390383, 0.006643157918006182, 0.015229299664497375, 0.3235494792461395, 0.009337316267192364, 0.04946617782115936, 0.0074899159371852875, 0.005904505494982004, 0.08630012720823288, 0.013564500026404858, 0.006529935635626316, 0.0073045543394982815, 0.281474769115448], [0.3593403697013855, 0.004025507718324661, 0.006888154428452253, 0.005860699340701103, 0.0020617141854017973, 0.006544760428369045, 0.004472521133720875, 0.12614770233631134, 0.008367729373276234, 0.15482084453105927, 0.009447501040995121, 0.021249212324619293, 0.13394621014595032, 0.006845593918114901, 0.020357808098196983, 0.010948236100375652, 0.11867548525333405], [0.4243904650211334, 0.00980836059898138, 0.01192525215446949, 0.008604561910033226, 0.00566175626590848, 0.01086607575416565, 0.0030110382940620184, 0.20327475666999817, 0.026776380836963654, 0.017595678567886353, 0.012794801034033298, 0.01770150288939476, 0.014544767327606678, 0.03148660808801651, 0.002583786379545927, 0.021791616454720497, 0.17718258500099182], [0.3249542713165283, 0.01082911528646946, 0.006215124856680632, 0.010009188205003738, 0.011470712721347809, 0.010403158143162727, 0.010292298160493374, 0.24442161619663239, 0.021513354033231735, 0.026710553094744682, 
[Truncated notebook cell output: nested attention-weight tensors (per-layer, per-head rows of 17 token-to-token attention probabilities) omitted.]
0.0004584474372677505, 0.4500989019870758], [0.042560283094644547, 0.0065888771787285805, 0.00018686740077100694, 0.0010426031658425927, 0.0024851651396602392, 0.0015640078345313668, 0.0007922332733869553, 0.3988771438598633, 0.049687668681144714, 0.00011549433838808909, 5.377915294957347e-05, 6.93651381880045e-05, 0.0030703223310410976, 0.15561144053936005, 0.000215352134546265, 7.176768849603832e-05, 0.33700770139694214], [0.037441935390233994, 0.0004136347852181643, 0.011043844744563103, 0.0011196581181138754, 3.729869786184281e-05, 0.00014811629080213606, 0.0020002175588160753, 0.41009408235549927, 0.0006227876292541623, 0.1812625527381897, 2.068597859761212e-05, 4.5539829443441704e-05, 0.00020551001944113523, 0.0012307715369388461, 0.0004834496940020472, 0.0005131270154379308, 0.3533167541027069], [0.04857879504561424, 0.0001723002060316503, 0.0003532098198775202, 0.000560449087060988, 8.844788681017235e-05, 7.446219387929887e-05, 0.0012558092130348086, 0.4835381805896759, 0.00036571864620782435, 8.022908150451258e-05, 0.0429312102496624, 4.923061169392895e-06, 0.0005013263435102999, 0.00010158536315429956, 0.00014242979523260146, 4.41177689936012e-05, 0.42120686173439026], [0.029085757210850716, 7.234243821585551e-05, 0.00012580951442942023, 0.0005900296964682639, 1.621705996512901e-05, 3.17109479510691e-05, 0.00011350863496772945, 0.4861902594566345, 0.0005849428125657141, 0.0006992157432250679, 2.1981948066240875e-06, 0.07086525112390518, 0.00010718940757215023, 0.0002574517857283354, 0.00010740019934019074, 0.000951035472098738, 0.4101995825767517], [0.01808866113424301, 0.0001583242992637679, 7.193268538685516e-05, 0.000644534535240382, 0.00027209732797928154, 5.0320708396611735e-05, 0.00013142655370756984, 0.5067704319953918, 0.003091164631769061, 0.0002688555105123669, 8.13846563687548e-05, 2.846964525815565e-05, 0.027925826609134674, 0.0004425636143423617, 3.95697497879155e-05, 2.55415743595222e-05, 0.44190889596939087], [0.03216021880507469, 
0.0007213260396383703, 4.0421393350698054e-05, 0.00032297358848154545, 0.0003534347633831203, 0.0005438064108602703, 0.00019766934565268457, 0.30977270007133484, 0.3253956139087677, 0.000357115117367357, 3.9029262552503496e-05, 9.155400039162487e-05, 0.00036315375473350286, 0.07443941384553909, 2.8258295060368255e-05, 2.1233567167655565e-05, 0.2551521360874176], [0.021212834864854813, 5.230680835666135e-05, 0.00039970065699890256, 0.0003058086149394512, 9.876871627056971e-05, 2.5868299417197704e-05, 0.0008874767809174955, 0.5200281143188477, 0.0009214602177962661, 0.004243588540703058, 3.8450696592917666e-05, 4.885780072072521e-05, 0.00024015876988414675, 0.00031510720145888627, 0.02063463255763054, 2.4892324290703982e-05, 0.4305218756198883], [0.027363931760191917, 1.9208800949854776e-05, 0.000636066310107708, 0.0008665363420732319, 4.061238087160746e-06, 8.586227522755507e-06, 0.00018449639901518822, 0.522641658782959, 5.679240348399617e-05, 0.0010260299313813448, 0.0001551132882013917, 0.0006890262593515217, 0.0001461226202081889, 3.545446816133335e-05, 2.685122308321297e-05, 0.018631594255566597, 0.4275084435939789], [0.0023526190780103207, 0.00034776434767991304, 0.0004243608273100108, 0.0006887199124321342, 0.00019291638454888016, 0.0003588495892472565, 0.0002830308221746236, 0.5436081290245056, 0.00010139773803530261, 0.00021934341930318624, 0.00037271188921295106, 0.00018052791710942984, 0.0003167309332638979, 0.00017876892525237054, 0.0001575702626723796, 0.0004735547991003841, 0.44974303245544434]], [[0.07597021758556366, 0.006919255945831537, 0.0029599866829812527, 0.028846262022852898, 0.0028434819541871548, 0.0038177575916051865, 0.005664214491844177, 0.18328829109668732, 0.08995450288057327, 0.06550900638103485, 0.021192628890275955, 0.03964025899767876, 0.19827915728092194, 0.03389964997768402, 0.011517140083014965, 0.04753292351961136, 0.18216530978679657], [0.012318715453147888, 0.11714029312133789, 0.11227518320083618, 0.3797248899936676, 
0.05269322916865349, 0.030792899429798126, 0.04625771939754486, 0.04122282564640045, 0.043717559427022934, 0.018247203901410103, 0.006764526478946209, 0.010199305601418018, 0.045513249933719635, 0.018859393894672394, 0.008777127601206303, 0.01673431321978569, 0.03876157104969025], [0.009736874140799046, 0.07908301800489426, 0.021416103467345238, 0.44562655687332153, 0.060280781239271164, 0.06152030825614929, 0.039246879518032074, 0.08115898817777634, 0.029900282621383667, 0.01048114150762558, 0.005554671864956617, 0.005080200266093016, 0.03273739293217659, 0.01840830035507679, 0.011064207181334496, 0.01361101120710373, 0.0750933364033699], [0.008224625140428543, 0.07518687844276428, 0.021815257146954536, 0.06683996319770813, 0.23545487225055695, 0.10646995902061462, 0.013243765570223331, 0.09121642261743546, 0.05386512354016304, 0.02465301752090454, 0.012636888772249222, 0.023020245134830475, 0.11487726122140884, 0.03181128203868866, 0.017780859023332596, 0.016344236209988594, 0.08655936270952225], [0.007346766535192728, 0.09923283010721207, 0.024501094594597816, 0.036406949162483215, 0.07115937024354935, 0.060015931725502014, 0.02000141330063343, 0.1890898495912552, 0.04575125873088837, 0.0261117871850729, 0.012910847552120686, 0.023673996329307556, 0.17047719657421112, 0.023648090660572052, 0.0038942787796258926, 0.018966713920235634, 0.16681154072284698], [0.014505370520055294, 0.09858587384223938, 0.09252838045358658, 0.10415410250425339, 0.06758163869380951, 0.04646061733365059, 0.029723752290010452, 0.10237446427345276, 0.08911532908678055, 0.032949600368738174, 0.013182645663619041, 0.03133295103907585, 0.10100719332695007, 0.04467092454433441, 0.010150223039090633, 0.027891041710972786, 0.09378580003976822], [0.007267709355801344, 0.04615989699959755, 0.07572895288467407, 0.05580735206604004, 0.054281044751405716, 0.05616047605872154, 0.04393092170357704, 0.2627074122428894, 0.025471851229667664, 0.014888076111674309, 0.04238918051123619, 
0.013592816889286041, 0.03668785095214844, 0.01570899412035942, 0.012538221664726734, 0.017344873398542404, 0.21933433413505554], [0.0004470108251553029, 0.0002877892693504691, 0.0004606699221767485, 0.0014353779843077064, 0.0002551719662733376, 0.0002365635591559112, 0.0004727586347144097, 0.5710947513580322, 0.00024402773124165833, 0.0003122408234048635, 0.00037764248554594815, 0.00013155993656255305, 0.0006600871565751731, 0.0001595701032783836, 0.00011618054850259796, 0.00011551237548701465, 0.4231931269168854], [0.0066459523513913155, 0.02284112758934498, 0.026174288243055344, 0.027046095579862595, 0.06022321805357933, 0.05978621542453766, 0.03948454186320305, 0.10292305797338486, 0.03604095056653023, 0.08145276457071304, 0.02543107606470585, 0.06686289608478546, 0.17348134517669678, 0.061417356133461, 0.03257506340742111, 0.08319999277591705, 0.09441407024860382], [0.0033439621329307556, 0.013494297862052917, 0.00951046496629715, 0.025303184986114502, 0.06697545945644379, 0.019412130117416382, 0.014736411161720753, 0.23537947237491608, 0.04714599996805191, 0.04347054287791252, 0.022238923236727715, 0.019498825073242188, 0.16459587216377258, 0.053431250154972076, 0.01087503507733345, 0.036922577768564224, 0.2136656641960144], [0.01460750587284565, 0.008798026479780674, 0.03186997398734093, 0.06215869262814522, 0.0131084518507123, 0.008235911838710308, 0.03266274556517601, 0.3689597249031067, 0.008283543400466442, 0.010916988365352154, 0.014220648445189, 0.05392098054289818, 0.015910834074020386, 0.006410167086869478, 0.009260648861527443, 0.01971750147640705, 0.32095763087272644], [0.009070897474884987, 0.006778859067708254, 0.011776060797274113, 0.02764994278550148, 0.02005440928041935, 0.011405045166611671, 0.007517619989812374, 0.368602991104126, 0.02516159787774086, 0.023731978610157967, 0.0522739514708519, 0.020105788484215736, 0.034212566912174225, 0.019038686528801918, 0.007841196842491627, 0.021220432594418526, 0.33355802297592163], 
[0.01117764227092266, 0.009820756502449512, 0.031176244840025902, 0.008311117999255657, 0.09045500308275223, 0.03456626087427139, 0.014328337274491787, 0.23565219342708588, 0.039786700159311295, 0.05051933228969574, 0.030611393973231316, 0.04036065191030502, 0.09937126934528351, 0.04594632610678673, 0.011639961041510105, 0.03783249482512474, 0.208444282412529], [0.009080884046852589, 0.013846849091351032, 0.02912377193570137, 0.015865998342633247, 0.04273603856563568, 0.050057753920555115, 0.01747593656182289, 0.20406168699264526, 0.033020202070474625, 0.08332094550132751, 0.027308376505970955, 0.08758986741304398, 0.08520613610744476, 0.05412470921874046, 0.019447466358542442, 0.04705364629626274, 0.18067970871925354], [0.011838131584227085, 0.011740190908312798, 0.01266005914658308, 0.009252853691577911, 0.017218273133039474, 0.015812871977686882, 0.005899511743336916, 0.3968925476074219, 0.01441878080368042, 0.01344319712370634, 0.06284923851490021, 0.01454130932688713, 0.011069831438362598, 0.018460463732481003, 0.008180155418813229, 0.020474731922149658, 0.3552478551864624], [0.007180192973464727, 0.017708424478769302, 0.024055900052189827, 0.04401049390435219, 0.03129751235246658, 0.018941527232527733, 0.021190479397773743, 0.24021752178668976, 0.04559827223420143, 0.12195011973381042, 0.03490599989891052, 0.036062076687812805, 0.06146353855729103, 0.04799607768654823, 0.02054181881248951, 0.014880000613629818, 0.21200010180473328], [0.0004155752540100366, 0.0002579358988441527, 0.00042668767855502665, 0.001331348903477192, 0.00023683438485022634, 0.00021040283900219947, 0.0004458858456928283, 0.5721243619918823, 0.00021547559299506247, 0.000282162829535082, 0.00033938809065148234, 0.00011812573211500421, 0.0005860547535121441, 0.00014078384265303612, 0.00010620499233482406, 0.00010536723857512698, 0.42265740036964417]], [[0.013206672854721546, 0.0015332826878875494, 0.004247786942869425, 0.005524610169231892, 0.0035232624504715204, 0.001979164546355605, 
0.00354636088013649, 0.4232901632785797, 0.020715411752462387, 0.050074558705091476, 0.006689382717013359, 0.013757086358964443, 0.04077060893177986, 0.005963142961263657, 0.012649299576878548, 0.014775129966437817, 0.37775397300720215], [0.0387628935277462, 0.0056776199489831924, 0.5421768426895142, 0.10218501091003418, 0.06321538239717484, 0.019394556060433388, 0.027700193226337433, 0.1075383722782135, 0.0012832700740545988, 0.0010397243313491344, 0.0014342182548716664, 0.0006958724115975201, 0.0008767597610130906, 0.00028362771263346076, 0.0008178178104571998, 0.0004932757001370192, 0.08642467856407166], [0.011248037219047546, 0.002659727819263935, 0.024609841406345367, 0.044674407690763474, 0.07690677046775818, 0.0641825869679451, 0.033800046890974045, 0.41705629229545593, 0.000769249745644629, 0.0025948353577405214, 0.0005109086050651968, 0.0005578022100962698, 0.0001843144273152575, 0.00018131428805645555, 0.0011019381927326322, 0.0009880842408165336, 0.3179738223552704], [0.012124452739953995, 0.0016930506099015474, 0.005616888869553804, 0.017337894067168236, 0.13421432673931122, 0.07684291154146194, 0.05173870548605919, 0.3819313049316406, 0.003475822973996401, 0.001747561153024435, 0.0009801273699849844, 0.00044221538701094687, 0.00046626964467577636, 0.001149501884356141, 0.002756391651928425, 0.0007771446253173053, 0.3067053556442261], [0.07281091064214706, 0.0003934443520847708, 0.006217190530151129, 0.0038915404584258795, 0.00322246877476573, 0.03549439087510109, 0.7055401802062988, 0.07665655761957169, 0.014282843098044395, 0.008846498094499111, 0.009721793234348297, 0.00039342354284599423, 0.0005300250486470759, 0.0005415272316895425, 0.0012351901968941092, 0.00037099054316058755, 0.0598510280251503], [0.07467374205589294, 0.000912967196200043, 0.0075102937407791615, 0.011451687663793564, 0.02884584665298462, 0.048175521194934845, 0.22033751010894775, 0.3151739835739136, 0.012889866717159748, 0.016228269785642624, 0.0118765439838171, 
0.0013648230815306306, 0.002573898760601878, 0.0005812491872347891, 0.002027815906330943, 0.00016453975695185363, 0.24521136283874512], [0.003622801508754492, 7.247672328958288e-05, 6.221976946108043e-05, 0.0009076372371055186, 0.0013292254880070686, 0.0003152289136778563, 0.00029119092505425215, 0.5510453581809998, 0.0011373133165761828, 0.004779613111168146, 0.00031177664641290903, 0.0008442172547802329, 0.0012870104983448982, 0.00019324499589856714, 0.00016735470853745937, 0.00017180822032969445, 0.4334615468978882], [0.0027854014188051224, 0.00022414341219700873, 0.0006922587053850293, 0.001304261269979179, 0.0012908950448036194, 0.0006807395839132369, 0.0008070301846601069, 0.5369035601615906, 0.0007738912245258689, 0.003911640495061874, 0.00048324151430279016, 0.0005967201432213187, 0.0026828080881386995, 0.0009651672444306314, 0.0009100367315113544, 0.0017351724673062563, 0.4432530999183655], [0.024708325043320656, 0.0008062993292696774, 0.0009322196710854769, 0.0006906930939294398, 0.0006919992156326771, 0.0006177034229040146, 0.006188294850289822, 0.16756001114845276, 0.02962748520076275, 0.22270900011062622, 0.11169373244047165, 0.1086287796497345, 0.13318923115730286, 0.01141897402703762, 0.01900392957031727, 0.013614606112241745, 0.14791879057884216], [0.015017969533801079, 0.00014674702833872288, 0.00011871708557009697, 0.00038661935832351446, 0.00022052458371035755, 9.99507392407395e-05, 0.0005309984553605318, 0.2981436550617218, 0.007017528638243675, 0.014008406549692154, 0.014474696479737759, 0.03518832474946976, 0.2686300277709961, 0.05893963947892189, 0.013562672771513462, 0.017359640449285507, 0.25615394115448], [0.010787694714963436, 0.00021934015967417508, 4.3296298827044666e-05, 0.00028484140057116747, 5.253265771898441e-05, 8.860318484948948e-05, 0.00012271829473320395, 0.503818929195404, 0.0014227618230506778, 0.015306280925869942, 0.0014115954982116818, 0.009455475956201553, 0.023021597415208817, 0.017656397074460983, 0.0031584366224706173, 
0.004959648475050926, 0.40818989276885986], [0.019775046035647392, 0.0004979056539013982, 0.00039776592166163027, 0.0008033571066334844, 0.0004726545885205269, 0.000195058441022411, 0.0006021548178978264, 0.13419508934020996, 0.005916981492191553, 0.018768170848488808, 0.017711156979203224, 0.03505523130297661, 0.38131290674209595, 0.11678009480237961, 0.041961733251810074, 0.10060297697782516, 0.12495163083076477], [0.06878522783517838, 0.00044730465742759407, 0.00040123920189216733, 0.000536124047357589, 0.000248724507400766, 0.00021820802066940814, 0.0007885429658927023, 0.2416054755449295, 0.0008294736035168171, 0.004048370756208897, 0.0038150856271386147, 0.006473804824054241, 0.08428594470024109, 0.10048052668571472, 0.07577506452798843, 0.19562748074531555, 0.215633362531662], [0.04427770525217056, 0.0010959581704810262, 0.0021852341014891863, 0.0010863611241802573, 0.0005874501657672226, 0.0005492506898008287, 0.0018735717749223113, 0.18085716664791107, 0.0006010523065924644, 0.00358277908526361, 0.004831456579267979, 0.00983841810375452, 0.0326489694416523, 0.038303621113300323, 0.1301141232252121, 0.3835523724555969, 0.164014533162117], [0.010868726298213005, 0.000507623830344528, 0.000992283457890153, 0.0010654046200215816, 0.00195686100050807, 0.000794819847214967, 0.004638891667127609, 0.5138244032859802, 0.0003933765401598066, 0.0021304129622876644, 0.0005681429174728692, 0.0011823376407846808, 0.0029030509758740664, 0.0023735705763101578, 0.0030291450675576925, 0.025817444548010826, 0.42695352435112], [0.008689657784998417, 0.00010850232501979917, 0.0003444029134698212, 0.0006851213402114809, 0.000611038994975388, 0.0002182644821004942, 0.0005896264337934554, 0.4846785068511963, 0.00118989625480026, 0.0017621115548536181, 0.0002332257863599807, 0.001039093709550798, 0.0039991410449147224, 0.0033981353044509888, 0.004661280661821365, 0.044583383947610855, 0.4432086944580078], [0.002714197849854827, 0.00018101122986990958, 0.0005452269688248634, 
0.0010359004372730851, 0.0010138415964320302, 0.000552557990886271, 0.000664101738948375, 0.5383545756340027, 0.0006913218530826271, 0.0034049120731651783, 0.00041311196400783956, 0.000538726628292352, 0.0023400436621159315, 0.0008439483353868127, 0.0008018675143830478, 0.0016404861817136407, 0.44426414370536804]], [[0.011345873586833477, 0.0007709477213211358, 0.0014038508525118232, 0.004738782998174429, 0.0007013700669631362, 0.001179673708975315, 0.003337634028866887, 0.49937576055526733, 0.0020365407690405846, 0.0030167163349688053, 0.000999649055302143, 0.0014266648795455694, 0.007196913938969374, 0.0011097381357103586, 0.0010770433582365513, 0.003063532756641507, 0.4572192430496216], [0.01053529791533947, 0.027524331584572792, 0.03197261691093445, 0.07738640904426575, 0.028833836317062378, 0.021090110763907433, 0.0924440249800682, 0.3720938265323639, 0.0037854250986129045, 0.002752794651314616, 0.0005459869280457497, 0.0017937066731974483, 0.00702672079205513, 0.0008293442078866065, 0.0007454490987583995, 0.0014969718176871538, 0.31914323568344116], [0.009984524920582771, 0.09105601161718369, 0.043333519250154495, 0.21863463521003723, 0.036004822701215744, 0.05453525856137276, 0.12730929255485535, 0.2157953977584839, 0.0069505032151937485, 0.0016770476941019297, 0.0005822463426738977, 0.0005809149006381631, 0.0017111212946474552, 0.003899676725268364, 0.002958230674266815, 0.0012217595940455794, 0.1837649643421173], [0.017450235784053802, 0.05098915845155716, 0.015524549409747124, 0.06202205643057823, 0.030074365437030792, 0.07405059784650803, 0.060297850519418716, 0.3576740324497223, 0.005175878759473562, 0.001557640265673399, 0.00392924714833498, 0.0013206643052399158, 0.005709785036742687, 0.0022886116057634354, 0.005686908960342407, 0.00462358957156539, 0.3016248345375061], [0.022489123046398163, 0.07351426035165787, 0.030595459043979645, 0.3329576253890991, 0.016714639961719513, 0.056758470833301544, 0.044932927936315536, 0.20562781393527985, 
0.008106258697807789, 0.005058916285634041, 0.004220655187964439, 0.0020410320721566677, 0.011312801390886307, 0.003253693925216794, 0.006771337706595659, 0.0018157936865463853, 0.17382921278476715], [0.00996533501893282, 0.027673326432704926, 0.015001049265265465, 0.18530084192752838, 0.08966254442930222, 0.0357201062142849, 0.01424003578722477, 0.3181236982345581, 0.005593562498688698, 0.0019186713034287095, 0.0010867591481655836, 0.0010455887531861663, 0.029361935332417488, 0.0022985092364251614, 0.0008409272413700819, 0.0011124672600999475, 0.2610546052455902], [0.011177428998053074, 0.04854544624686241, 0.012283788993954659, 0.02792965993285179, 0.06821314990520477, 0.0958329290151596, 0.00607072189450264, 0.3744601309299469, 0.005121776368469, 0.0049051446840167046, 0.0020049819722771645, 0.010967169888317585, 0.014469522051513195, 0.002558438340201974, 0.0030325050465762615, 0.0010383353801444173, 0.31138888001441956], [0.0018288403516635299, 0.00034177026827819645, 0.000868281233124435, 0.0009339270181953907, 0.00018933658429887146, 0.0005072789499536157, 0.0005265168729238212, 0.5551133751869202, 0.0003731333708856255, 0.0004173602210357785, 0.00012384286674205214, 0.00014897013898007572, 0.00047953444300219417, 0.00022992586309555918, 0.00013811467215418816, 0.00014782029029447585, 0.4376320242881775], [0.028288448229432106, 0.013921759091317654, 0.019955700263381004, 0.01564882881939411, 0.014824780635535717, 0.03738304600119591, 0.016180751845240593, 0.3463776707649231, 0.022005783393979073, 0.025955889374017715, 0.008044268935918808, 0.026218431070446968, 0.07485531270503998, 0.006993490736931562, 0.006199105177074671, 0.009632807224988937, 0.3275138735771179], [0.015722323209047318, 0.012799189426004887, 0.024145575240254402, 0.005295650102198124, 0.024525925517082214, 0.04366506263613701, 0.008266536518931389, 0.1459624469280243, 0.13981643319129944, 0.029632318764925003, 0.06461349129676819, 0.050010647624731064, 0.21709710359573364, 
0.05096643790602684, 0.008363843895494938, 0.02374880760908127, 0.13536809384822845], [0.012077665887773037, 0.013800821267068386, 0.0194963701069355, 0.0014476276701316237, 0.0062560392543673515, 0.02170725166797638, 0.004694177769124508, 0.44704803824424744, 0.032420624047517776, 0.014501435682177544, 0.0016951668076217175, 0.0038341255858540535, 0.010030636563897133, 0.015393426641821861, 0.00880120787769556, 0.0031624718103557825, 0.38363295793533325], [0.025811590254306793, 0.006270482204854488, 0.021383745595812798, 0.01874261163175106, 0.008837755769491196, 0.012617083266377449, 0.012052659876644611, 0.30576974153518677, 0.058332040905952454, 0.04825763404369354, 0.015365748666226864, 0.00734691321849823, 0.1366000920534134, 0.024931928142905235, 0.005069859325885773, 0.008471430279314518, 0.2841387093067169], [0.10116562247276306, 0.00922666396945715, 0.014383894391357899, 0.020008442923426628, 0.015994219109416008, 0.028799451887607574, 0.013592690229415894, 0.21266353130340576, 0.05109625309705734, 0.037287455052137375, 0.056167565286159515, 0.04833369702100754, 0.1298355758190155, 0.017450066283345222, 0.02126818336546421, 0.01967090740799904, 0.20305587351322174], [0.05394609645009041, 0.014758073724806309, 0.019108371809124947, 0.009098870679736137, 0.012177100405097008, 0.04340613633394241, 0.006621689070016146, 0.22507818043231964, 0.053362827748060226, 0.03042423725128174, 0.019929885864257812, 0.02921587973833084, 0.21143800020217896, 0.03299817442893982, 0.020433923229575157, 0.011744922958314419, 0.2062576860189438], [0.019683953374624252, 0.004259009845554829, 0.026455890387296677, 0.007167668081820011, 0.005343365017324686, 0.012682690285146236, 0.003177955513820052, 0.3726280927658081, 0.018664361909031868, 0.021120937541127205, 0.028131481260061264, 0.015818949788808823, 0.08525123447179794, 0.017485465854406357, 0.007051728665828705, 0.018232619389891624, 0.3368445634841919], [0.0320599302649498, 0.00398058770224452, 0.007576689589768648, 
0.005517119541764259, 0.006193910725414753, 0.015832168981432915, 0.0027940492145717144, 0.2618427872657776, 0.0355912446975708, 0.020826207473874092, 0.01384084951132536, 0.012926173396408558, 0.29909566044807434, 0.030012499541044235, 0.006058905739337206, 0.006729679182171822, 0.23912139236927032], [0.0017211873782798648, 0.0002902017149608582, 0.0007449788390658796, 0.0007946657133288682, 0.00015725726552773267, 0.0004422204801812768, 0.0004456284805200994, 0.5556318163871765, 0.0003441806766204536, 0.00038795583532191813, 0.00011028895096387714, 0.00013749119534622878, 0.0004364426131360233, 0.00021114558330737054, 0.0001244174491148442, 0.00013390782987698913, 0.43788620829582214]], [[0.014110767282545567, 0.0024705822579562664, 0.002178468741476536, 0.00584516953676939, 0.004958294797688723, 0.009759688749909401, 0.00439324788749218, 0.46703261137008667, 0.004235672298818827, 0.0037168862763792276, 0.02746523730456829, 0.006583313923329115, 0.008500880561769009, 0.0028269877657294273, 0.002390121342614293, 0.006044536828994751, 0.4274875819683075], [0.014817937277257442, 0.01925741694867611, 0.028608912602066994, 0.02303149923682213, 0.036657921969890594, 0.0022971874568611383, 0.01103015523403883, 0.456119179725647, 0.0012087792856618762, 0.005246355663985014, 0.0015149918617680669, 0.004557821433991194, 0.009032451547682285, 0.0027235455345362425, 0.0019944782834500074, 0.0061264643445611, 0.37577494978904724], [0.01231284998357296, 0.16504409909248352, 0.013803359121084213, 0.00564383203163743, 0.0042946659959852695, 0.001500042388215661, 0.0014556397218257189, 0.43885353207588196, 0.0019745526369661093, 0.0004274443781469017, 0.00020986043091397732, 7.920734060462564e-05, 0.00011173914390383288, 0.0013359427684918046, 0.001320637995377183, 0.0008220553863793612, 0.35081055760383606], [0.0060074664652347565, 0.004852284677326679, 0.08951862901449203, 0.022039301693439484, 0.00435992144048214, 0.0026769288815557957, 0.0045935423113405704, 
0.46727070212364197, 3.2905514672165737e-05, 1.4345622730616014e-05, 0.0001393159182043746, 4.3390155042288825e-05, 0.00014946729061193764, 4.6499193558702245e-05, 0.0006850280915386975, 0.0009455970721319318, 0.3966245949268341], [0.0790741890668869, 0.0030712017323821783, 0.0073019894771277905, 0.5538678169250488, 0.004058005288243294, 0.009985352866351604, 0.06533657014369965, 0.13933014869689941, 0.00015800043183844537, 0.0004964639665558934, 0.00029919258668087423, 0.0002250416437163949, 0.000303982145851478, 0.00011089473264291883, 0.018882932141423225, 0.004893782548606396, 0.11260446161031723], [0.02261258475482464, 0.0016960124485194683, 0.012181530706584454, 0.12831464409828186, 0.28239110112190247, 0.0186010729521513, 0.024396926164627075, 0.27063822746276855, 0.0005782949738204479, 0.0035174130462110043, 0.0006784854922443628, 0.001847911742515862, 0.0007668919861316681, 0.00015998353774193674, 0.0013151849852874875, 0.003741527209058404, 0.22656220197677612], [0.06165337935090065, 0.005149534437805414, 0.0006064609042368829, 0.006854188162833452, 0.028681613504886627, 0.08585155010223389, 0.025986842811107635, 0.4233226478099823, 0.017968611791729927, 0.0009039347642101347, 0.0019463000353425741, 0.0009101430187001824, 0.0004345017368905246, 0.0011427525896579027, 0.00016804800543468446, 0.0012700501829385757, 0.3371495008468628], [0.0019813745748251677, 0.0014737880555912852, 0.0011357017792761326, 0.001283604302443564, 0.0023400718346238136, 0.0016586726997047663, 0.0010090606519952416, 0.5301252603530884, 0.0009908988140523434, 0.0012801074190065265, 0.001005787868052721, 0.0014225491322577, 0.0009969007223844528, 0.0004917752812616527, 0.0008698684396222234, 0.0007379172602668405, 0.4511966407299042], [0.014917157590389252, 0.0005298461765050888, 0.00012571160914376378, 0.003080169903114438, 0.0018169389804825187, 0.0008548601763322949, 0.021253390237689018, 0.445070743560791, 0.018381619825959206, 0.10098598152399063, 0.006466514430940151, 
[Raw attention-weight matrices omitted: nested lists of per-token attention probabilities over a 17-token sequence, one list per attention head and layer. This block is serialized notebook output feeding the interactive head-view attention visualization and is not meant to be read as text.]
0.03700881823897362, 0.013901754282414913, 0.021115509793162346, 0.0012138357851654291, 0.0024888641200959682, 0.0026992966886609793, 0.4304604232311249], [0.016472425311803818, 0.0003363966243341565, 7.36086440156214e-05, 9.371815394842997e-05, 7.502674270654097e-05, 0.00010294261301169172, 0.0024506691843271255, 0.2977106273174286, 0.0015166756929829717, 0.0007380940951406956, 0.047028712928295135, 0.06333784759044647, 0.2629038989543915, 0.027818135917186737, 0.0025195099879056215, 0.010954486206173897, 0.2658672630786896], [0.015679681673645973, 0.0006053984980098903, 0.00010532861779211089, 0.0001780543097993359, 0.00022948473633732647, 0.0001152951517724432, 0.0002248886157758534, 0.5013548135757446, 0.0017739135073497891, 0.0002931178023573011, 0.00082678027683869, 0.0013597175711765885, 0.0032266092021018267, 0.025666100904345512, 0.001478736405260861, 0.003211310598999262, 0.44367069005966187], [0.023467374965548515, 0.0010534296743571758, 2.713039066293277e-05, 0.0010348653886467218, 0.0002579364809207618, 0.0003147345269098878, 0.0008283740608021617, 0.044748660176992416, 7.58034730097279e-05, 0.000670621870085597, 0.0007485253154300153, 0.003908259328454733, 0.0047384039498865604, 0.005386083386838436, 0.8676328659057617, 0.0020850251894444227, 0.04302188381552696], [0.002711727749556303, 0.0009087404469028115, 0.0004064514650963247, 0.0013229845790192485, 0.0006673884927295148, 0.0031175410840660334, 0.0006192386499606073, 0.5247302651405334, 0.0002568238414824009, 0.00015148324018809944, 0.0004484520177356899, 0.00037594381137751043, 0.00028538802871480584, 0.00018772503244690597, 0.0003250270674470812, 0.00014226870553102344, 0.4633425176143646]], [[0.03876497969031334, 0.022126680240035057, 0.0103555116802454, 0.005037189461290836, 0.002751772990450263, 0.006483457516878843, 0.0275814700871706, 0.2886623740196228, 0.039803650230169296, 0.08298151195049286, 0.009118134155869484, 0.01993367075920105, 0.08359593152999878, 0.02048637717962265, 
0.004613171797245741, 0.06120169535279274, 0.27650249004364014], [0.020735636353492737, 0.18080101907253265, 0.09438339620828629, 0.037630680948495865, 0.021607106551527977, 0.10220450162887573, 0.25501832365989685, 0.0474819540977478, 0.040802497416734695, 0.03361200541257858, 0.003109042765572667, 0.011785808019340038, 0.05537364259362221, 0.015619650483131409, 0.009559083729982376, 0.027501536533236504, 0.042774055153131485], [0.03177400305867195, 0.11751070618629456, 0.028056221082806587, 0.02296084724366665, 0.02083009108901024, 0.05056319013237953, 0.03823808208107948, 0.27009543776512146, 0.03174891322851181, 0.01347424928098917, 0.00827068742364645, 0.01142788678407669, 0.0664491280913353, 0.013697739690542221, 0.005421096459031105, 0.033135898411273956, 0.2363457828760147], [0.039213769137859344, 0.08718899637460709, 0.03629929572343826, 0.015279576182365417, 0.033467333763837814, 0.05603574588894844, 0.04724758863449097, 0.26336684823036194, 0.035768333822488785, 0.022858239710330963, 0.006760303396731615, 0.009604404680430889, 0.06222238019108772, 0.014405464753508568, 0.01047576405107975, 0.02519325725734234, 0.23461279273033142], [0.02170759066939354, 0.07290694117546082, 0.06711792200803757, 0.037599433213472366, 0.0173651035875082, 0.04318316653370857, 0.08134033530950546, 0.22826775908470154, 0.036034271121025085, 0.03558309003710747, 0.005060053430497646, 0.048922572284936905, 0.0544293150305748, 0.01548344362527132, 0.002886928850784898, 0.027625951915979385, 0.2044862061738968], [0.013152213767170906, 0.2217816561460495, 0.06836797297000885, 0.06536387652158737, 0.013532874174416065, 0.104002445936203, 0.12223000824451447, 0.1324240267276764, 0.02194560319185257, 0.011199736036360264, 0.0016497995238751173, 0.010398166254162788, 0.05470310524106026, 0.009779373183846474, 0.004581306129693985, 0.028963813558220863, 0.11592409759759903], [0.17135226726531982, 0.3311707675457001, 0.06837645918130875, 0.06003308668732643, 0.03025788441300392, 
0.04735049232840538, 0.007376356050372124, 0.1327797919511795, 0.012032506987452507, 0.005699047818779945, 0.0018891083309426904, 0.003496119286864996, 0.006733338348567486, 0.005027373321354389, 0.00044935112236998975, 0.00994600635021925, 0.10603003948926926], [0.008763023652136326, 0.0030135209672152996, 0.000634426367469132, 0.00047694332897663116, 0.0009441105066798627, 0.0012360878754407167, 0.0007386246579699218, 0.5203231573104858, 0.0018118080915883183, 0.000437173672253266, 0.0015619267942383885, 0.0006402708822861314, 0.0014853114262223244, 0.0007494789315387607, 0.0002804916293825954, 0.0010225086007267237, 0.45588117837905884], [0.013150902464985847, 0.03245753422379494, 0.11305023729801178, 0.052529383450746536, 0.022034816443920135, 0.033888883888721466, 0.05330939590930939, 0.20744991302490234, 0.01903660222887993, 0.07757310569286346, 0.0057386248372495174, 0.03152724355459213, 0.06668046861886978, 0.020894350484013557, 0.005881402641534805, 0.052494291216135025, 0.1923028528690338], [0.007649420760571957, 0.009666509926319122, 0.008746130391955376, 0.0042511168867349625, 0.0035816689487546682, 0.008443129248917103, 0.0057349069975316525, 0.4438745677471161, 0.016291992738842964, 0.008416545577347279, 0.0015642602229490876, 0.006453413981944323, 0.023659490048885345, 0.006778723560273647, 0.001809898647479713, 0.02294142358005047, 0.42013677954673767], [0.030145768076181412, 0.04984056577086449, 0.025190211832523346, 0.010861831717193127, 0.0041696783155202866, 0.029273705556988716, 0.0030322240199893713, 0.33715149760246277, 0.08034559339284897, 0.009701335802674294, 0.002904896391555667, 0.009433066472411156, 0.021289357915520668, 0.0696469396352768, 0.002125467173755169, 0.014937041327357292, 0.2999506890773773], [0.009758849628269672, 0.03703921660780907, 0.03634372353553772, 0.02543223835527897, 0.025850653648376465, 0.05589013174176216, 0.024128077551722527, 0.29243671894073486, 0.05739610642194748, 0.03775964304804802, 0.0036061694845557213, 
0.010608300566673279, 0.03746131435036659, 0.04398012161254883, 0.0040132892318069935, 0.027268096804618835, 0.2710273265838623], [0.034646399319171906, 0.02288265898823738, 0.03461689129471779, 0.011713674291968346, 0.008231036365032196, 0.013418920338153839, 0.009845656342804432, 0.3393171429634094, 0.06567858159542084, 0.031617384403944016, 0.0022852676920592785, 0.013201496563851833, 0.037200313061475754, 0.03300933167338371, 0.002704222220927477, 0.027280626818537712, 0.3123503625392914], [0.00912979245185852, 0.03145089000463486, 0.08726119250059128, 0.0415167361497879, 0.01635095663368702, 0.030798984691500664, 0.02918403223156929, 0.18993403017520905, 0.05052550882101059, 0.09130696207284927, 0.005074433982372284, 0.038552116602659225, 0.09877417236566544, 0.03906463086605072, 0.007693320047110319, 0.05659914389252663, 0.17678314447402954], [0.01998925395309925, 0.014997678808867931, 0.012553824111819267, 0.004337131977081299, 0.006514464970678091, 0.017683975398540497, 0.002952870214357972, 0.3401181101799011, 0.0326274149119854, 0.010301818139851093, 0.004583478439599276, 0.02250749059021473, 0.12422005087137222, 0.026111392304301262, 0.003878270508721471, 0.039610058069229126, 0.31701260805130005], [0.013137602247297764, 0.041059356182813644, 0.039435263723134995, 0.025700561702251434, 0.009083135984838009, 0.0233684703707695, 0.015237030573189259, 0.1750115007162094, 0.10279016196727753, 0.13121238350868225, 0.0059186373837292194, 0.03298543393611908, 0.06696045398712158, 0.04063171520829201, 0.006234882399439812, 0.10606605559587479, 0.16516730189323425], [0.007749977521598339, 0.0028419147711247206, 0.0006022349116392434, 0.0004622976703103632, 0.0008658506558276713, 0.0011800676584243774, 0.0007281464640982449, 0.521176278591156, 0.0017022948013618588, 0.00042232326813973486, 0.0014915397623553872, 0.0006168946856632829, 0.00144483451731503, 0.0007075549219734967, 0.0002685310610104352, 0.000995673588477075, 0.45674359798431396]], 
[[0.010651291348040104, 0.04809771850705147, 0.03837062418460846, 0.05298202112317085, 0.006948170717805624, 0.04825332760810852, 0.026434265077114105, 0.41313666105270386, 0.0010254706721752882, 0.0010317113483324647, 0.0010007532546296716, 0.0009914664551615715, 0.0013289713533595204, 0.00042631823453120887, 0.00022831273963674903, 0.001576291979290545, 0.3475165069103241], [0.014919126406311989, 0.021318677812814713, 0.03674667328596115, 0.029242640361189842, 0.003777979174628854, 0.006424322258681059, 0.016700472682714462, 0.4640410840511322, 0.002469720784574747, 0.0010629519820213318, 0.0015861743595451117, 0.0021083310712128878, 0.002545704832300544, 0.0014238560106605291, 0.0007297685951925814, 0.002370092086493969, 0.3925323486328125], [0.023660337552428246, 0.034027595072984695, 0.012960715219378471, 0.01680544763803482, 0.0023265210911631584, 0.0022834965493530035, 0.004435372073203325, 0.4843156039714813, 0.0006881437147967517, 0.00023438839707523584, 0.00016126909758895636, 0.0012076067505404353, 0.0008306339732371271, 0.0004832753911614418, 0.00037797485128976405, 0.0009249240974895656, 0.414276659488678], [0.02799510769546032, 0.04243740066885948, 0.025586305186152458, 0.041481826454401016, 0.010210365988314152, 0.004426650702953339, 0.014089532196521759, 0.4427638649940491, 0.0008504438446834683, 0.000722077616956085, 0.00033967633498832583, 0.00204147188924253, 0.0019032250856980681, 0.0009658546186983585, 0.001058993162587285, 0.0032561009284108877, 0.3798711597919464], [0.06285551190376282, 0.08082985877990723, 0.04660291597247124, 0.1682017743587494, 0.004451194312423468, 0.02477400191128254, 0.03118986263871193, 0.30581629276275635, 0.0024290482979267836, 0.001410001888871193, 0.0001800085447030142, 0.003304483834654093, 0.0016586334677413106, 0.0014656806597486138, 0.0009977830341085792, 0.005537942051887512, 0.25829508900642395], [0.08650829643011093, 0.12271664291620255, 0.08358984440565109, 0.06928369402885437, 0.008653477765619755, 
0.021964335814118385, 0.03701992332935333, 0.2974291443824768, 0.0014521965058520436, 0.0014835780020803213, 0.0004410279798321426, 0.00224329368211329, 0.0038228461053222418, 0.0012591707054525614, 0.0010920945787802339, 0.0038691353984177113, 0.2571713626384735], [0.08830947428941727, 0.1496984362602234, 0.1131545752286911, 0.06324763596057892, 0.046971119940280914, 0.05959483981132507, 0.027199089527130127, 0.2275477796792984, 0.0033322866074740887, 0.0017076900694519281, 0.0007157978252507746, 0.0020855353213846684, 0.0034660876262933016, 0.0015899846330285072, 0.0017984001897275448, 0.007070397026836872, 0.20251090824604034], [0.003373810788616538, 0.0019078099867329001, 0.0026289490051567554, 0.0021171006374061108, 0.0008784524980001152, 0.0016454076394438744, 0.003511374583467841, 0.5309630632400513, 0.0013943014200776815, 0.0012888247147202492, 0.00030531216179952025, 0.0013363654725253582, 0.001364801893942058, 0.0004895798047073185, 0.0005624095792882144, 0.0017100380500778556, 0.4445224404335022], [0.0173655953258276, 0.012332079000771046, 0.00822543352842331, 0.017941059544682503, 0.013750811107456684, 0.06732749938964844, 0.07059092074632645, 0.3821801543235779, 0.031292904168367386, 0.013070831075310707, 0.002057603793218732, 0.01646636240184307, 0.004295059479773045, 0.00374570838175714, 0.0011815667385235429, 0.007967965677380562, 0.3302084505558014], [0.018085109069943428, 0.00342858606018126, 0.0006941453902982175, 0.0017370340647175908, 0.001500184298492968, 0.008416896685957909, 0.030099373310804367, 0.4786571264266968, 0.02434701658785343, 0.005821674130856991, 0.0006122009363025427, 0.0014491616748273373, 0.0026358875911682844, 0.0009106830111704767, 0.0001363616465823725, 0.004542248789221048, 0.41692638397216797], [0.008778916671872139, 0.0007945428951643407, 0.00018470697978045791, 0.0003769229515455663, 0.0005714495200663805, 0.0012644171947613358, 0.007875319570302963, 0.5242693424224854, 0.0042025926522910595, 0.0012988975504413247, 
0.0003172823053319007, 0.00035968300653621554, 0.001122661167755723, 0.00015210478159133345, 6.0453618061728776e-05, 0.0001846879458753392, 0.44818612933158875], [0.025506924837827682, 0.0022416897118091583, 0.000865351059474051, 0.0004380106693133712, 0.0003928901569452137, 0.0045121763832867146, 0.024780401960015297, 0.4575302302837372, 0.036716528236866, 0.018769215792417526, 0.0034955008886754513, 0.006563994567841291, 0.006595167797058821, 0.0009471868397668004, 0.0006536885630339384, 0.005240521859377623, 0.4047505557537079], [0.05743158236145973, 0.005027624312788248, 0.001369799138046801, 0.007941259071230888, 0.0008900559041649103, 0.00382581097073853, 0.035327788442373276, 0.3596605658531189, 0.12167783826589584, 0.04556646943092346, 0.003132336540147662, 0.028026103973388672, 0.005150756798684597, 0.0035878149792551994, 0.0013962306547909975, 0.007043134421110153, 0.31294485926628113], [0.01980031281709671, 0.0059632714837789536, 0.0015726395649835467, 0.002767251804471016, 0.0010932899313047528, 0.006237796042114496, 0.016724906861782074, 0.3652184009552002, 0.04725727438926697, 0.04931535944342613, 0.009173160418868065, 0.08575954288244247, 0.03465646505355835, 0.017156459391117096, 0.0033527254126966, 0.012498130090534687, 0.3214530348777771], [0.013980960473418236, 0.0009090476669371128, 0.0003003641904797405, 0.00018562388140708208, 0.000131691136630252, 0.00043581053614616394, 0.0017971521010622382, 0.5087334513664246, 0.0033680491615086794, 0.004310212563723326, 0.0007405822398141026, 0.002859764965251088, 0.01034164521843195, 0.00408777454867959, 0.0003879657597281039, 0.004286731593310833, 0.4431430995464325], [0.04793037474155426, 0.0015925753396004438, 0.001197886886075139, 0.002495001768693328, 0.00012958589650224894, 0.0005296710878610611, 0.008238089270889759, 0.24836114048957825, 0.049676563590765, 0.04282013699412346, 0.0015413867076858878, 0.07862626016139984, 0.16847676038742065, 0.08147833496332169, 0.008081121370196342, 
0.04021488130092621, 0.2186102420091629], [0.0032133355271071196, 0.0016774703981354833, 0.0023382508661597967, 0.0018640425987541676, 0.0007565852720290422, 0.0014587444020435214, 0.003157190978527069, 0.5327511429786682, 0.0012912615202367306, 0.001208747737109661, 0.0002833276812452823, 0.0012581556802615523, 0.001255515730008483, 0.00045139979920350015, 0.0005192536045797169, 0.0016066492535173893, 0.44490891695022583]], [[0.02171350084245205, 0.043961405754089355, 0.027329180389642715, 0.09552139788866043, 0.01987740583717823, 0.017374461516737938, 0.024121228605508804, 0.34085068106651306, 0.010602046735584736, 0.015572507865726948, 0.002919234102591872, 0.016444621607661247, 0.00948400143533945, 0.007504765409976244, 0.005665397737175226, 0.028669318184256554, 0.3123888373374939], [0.10502757877111435, 0.1395493894815445, 0.060530662536621094, 0.09872180968523026, 0.00847651343792677, 0.041486553847789764, 0.06679069250822067, 0.20897352695465088, 0.02419460564851761, 0.018597327172756195, 0.0012517449213191867, 0.0035619433037936687, 0.007562574464827776, 0.008795957081019878, 0.0018029508646577597, 0.013797793537378311, 0.19087831676006317], [0.06388872861862183, 0.07454962283372879, 0.014247567392885685, 0.021648194640874863, 0.0031306990422308445, 0.007831403985619545, 0.14405226707458496, 0.332382470369339, 0.004642088431864977, 0.004319594241678715, 0.0009615204762667418, 0.002426187274977565, 0.013558994978666306, 0.001495914999395609, 0.0010653856443241239, 0.01319281104952097, 0.29660651087760925], [0.09268822520971298, 0.024194443598389626, 0.07024107128381729, 0.015059195458889008, 0.017504636198282242, 0.025016695261001587, 0.06702037900686264, 0.3298759460449219, 0.007229585666209459, 0.01375966053456068, 0.009553618729114532, 0.0049579450860619545, 0.0142865851521492, 0.005110567901283503, 0.00464595528319478, 0.006266890559345484, 0.29258856177330017], [0.05810362845659256, 0.010257489047944546, 0.019693927839398384, 0.06425942480564117, 
0.020529065281152725, 0.03000796213746071, 0.09162784367799759, 0.3513181209564209, 0.0070266081020236015, 0.013583390973508358, 0.0047304402105510235, 0.003121491987258196, 0.006715251132845879, 0.0029583838768303394, 0.003388304030522704, 0.0029179633129388094, 0.3097607493400574], [0.06856624037027359, 0.03256785124540329, 0.014049772173166275, 0.0805036649107933, 0.016232246533036232, 0.06494304537773132, 0.09566733241081238, 0.29886895418167114, 0.017963798716664314, 0.015393528155982494, 0.001230248250067234, 0.003286918858066201, 0.004311148077249527, 0.011128339916467667, 0.0016509564593434334, 0.008068233728408813, 0.26556769013404846], [0.03624676540493965, 0.0190057884901762, 0.05280241742730141, 0.07030589878559113, 0.05238155648112297, 0.023108039051294327, 0.012540639378130436, 0.32202988862991333, 0.028573546558618546, 0.012158461846411228, 0.00569486478343606, 0.010967280715703964, 0.016376277431845665, 0.02362043596804142, 0.008242800831794739, 0.015335899777710438, 0.2906094193458557], [0.00661076745018363, 0.0014183290768414736, 0.0031545336823910475, 0.005009150598198175, 0.0044376542791724205, 0.0026984727010130882, 0.010319417342543602, 0.489732027053833, 0.0025765933096408844, 0.0028269358444958925, 0.010803340002894402, 0.0025763106532394886, 0.005805999506264925, 0.002693208632990718, 0.004991770721971989, 0.003912807442247868, 0.44043275713920593], [0.047381069511175156, 0.012359660118818283, 0.004353589378297329, 0.00653199153020978, 0.005484114401042461, 0.023549117147922516, 0.053630370646715164, 0.30780676007270813, 0.0069723245687782764, 0.036561209708452225, 0.00951547734439373, 0.01772806979715824, 0.027495892718434334, 0.006739427335560322, 0.09637050330638885, 0.05533894896507263, 0.28218144178390503], [0.034640539437532425, 0.006264038383960724, 0.005435112398117781, 0.007389870472252369, 0.0033141898456960917, 0.00421362416818738, 0.012954137288033962, 0.3591756522655487, 0.06823336333036423, 0.008139592595398426, 
0.004202970769256353, 0.029871249571442604, 0.036503441631793976, 0.05490024387836456, 0.0033086237963289022, 0.03949131816625595, 0.32196205854415894], [0.01058492437005043, 0.0007235348457470536, 0.0007941168732941151, 0.002931152004748583, 0.001791826798580587, 0.0007360016461461782, 0.00902820099145174, 0.4846690893173218, 0.012151957489550114, 0.013282187283039093, 0.0014336848398670554, 0.0032457103952765465, 0.014058132655918598, 0.00560242123901844, 0.00534347677603364, 0.006035238970071077, 0.42758825421333313], [0.0384397879242897, 0.006133506540209055, 0.005038060713559389, 0.005221458617597818, 0.003494358155876398, 0.006681034341454506, 0.01237774919718504, 0.34927693009376526, 0.036411579698324203, 0.06803665310144424, 0.0020022455137223005, 0.015333653427660465, 0.04682603105902672, 0.02572544291615486, 0.003129608230665326, 0.06415924429893494, 0.3117125630378723], [0.03743297979235649, 0.005005800165235996, 0.004028538707643747, 0.010839811526238918, 0.005112554877996445, 0.012263841927051544, 0.023093393072485924, 0.3504691421985626, 0.03278140351176262, 0.04700236767530441, 0.0029000556096434593, 0.023893162608146667, 0.007907158695161343, 0.020426703616976738, 0.01360974833369255, 0.09100428223609924, 0.31222906708717346], [0.05911274626851082, 0.007463172543793917, 0.001607951009646058, 0.0021520424634218216, 0.0027579511515796185, 0.029024049639701843, 0.031181633472442627, 0.3219906985759735, 0.00858994759619236, 0.04992857947945595, 0.004128327127546072, 0.024572070688009262, 0.019279589876532555, 0.010351375676691532, 0.04234553128480911, 0.09213536977767944, 0.293379008769989], [0.04326111450791359, 0.001792263356037438, 0.001884629251435399, 0.002049521543085575, 0.0016244181897491217, 0.0033100415021181107, 0.020199742168188095, 0.40729179978370667, 0.05345752090215683, 0.01687440276145935, 0.008109679445624352, 0.004959262907505035, 0.02565581165254116, 0.02906103804707527, 0.0010828787926584482, 0.011575673706829548, 
0.3678101897239685], [0.053817104548215866, 0.003218897385522723, 0.008273841813206673, 0.0031428206712007523, 0.002010904485359788, 0.003404677379876375, 0.022902250289916992, 0.26990529894828796, 0.07430361211299896, 0.1382715255022049, 0.003058353206142783, 0.04542470723390579, 0.03490329906344414, 0.05246013402938843, 0.004059332888573408, 0.040145374834537506, 0.24069781601428986], [0.006460660602897406, 0.0013142969692125916, 0.002969311783090234, 0.0046918513253331184, 0.004023895598948002, 0.002570820739492774, 0.009793058037757874, 0.4923669993877411, 0.0024306396953761578, 0.0026485170237720013, 0.010228059254586697, 0.002346064429730177, 0.005267712753266096, 0.0025212131440639496, 0.004633523058146238, 0.0036575598642230034, 0.4420759081840515]], [[0.013448210433125496, 0.0036647433880716562, 0.0012319096131250262, 0.0033310509752482176, 0.00045568624045699835, 0.0029230229556560516, 0.012249689549207687, 0.41929471492767334, 0.030255531892180443, 0.01950734667479992, 0.007640687748789787, 0.007631195243448019, 0.04963015019893646, 0.006127719301730394, 0.0011967391474172473, 0.027146028354763985, 0.39426562190055847], [0.021480988711118698, 0.20030753314495087, 0.043708205223083496, 0.007401058450341225, 0.0058999271132051945, 0.08866851031780243, 0.016682833433151245, 0.300577312707901, 0.02600044012069702, 0.0027000524569302797, 0.0015832876088097692, 0.0006119809113442898, 0.0017696209251880646, 0.0030809317249804735, 0.0005779119092039764, 0.0019842691253870726, 0.27696508169174194], [0.012251793406903744, 0.046869199723005295, 0.02878580056130886, 0.005244807805866003, 0.0007193086203187704, 0.017941992729902267, 0.0037366722244769335, 0.45272722840309143, 0.01414021197706461, 0.004053107462823391, 0.0025738326366990805, 0.0006255095941014588, 0.0006205015815794468, 0.0017212223028764129, 0.00038242997834458947, 0.0034947048407047987, 0.40411174297332764], [0.0041989777237176895, 0.00040324119618162513, 0.001691016019321978, 0.021200507879257202, 
0.0019671281334012747, 0.0017548376927152276, 0.011250638402998447, 0.5035886168479919, 0.0011291989358142018, 0.0011552784126251936, 0.0030268945265561342, 0.0007752844248898327, 0.0003820057900156826, 0.0001932286686496809, 9.917026909533888e-05, 0.002725322265177965, 0.44445863366127014], [0.00999806821346283, 0.0020211555529385805, 0.0015719713410362601, 0.0025809563230723143, 0.021287374198436737, 0.021788880228996277, 0.009073415771126747, 0.49419695138931274, 0.00348889478482306, 0.0039589665830135345, 0.002632461255416274, 0.0007558973738923669, 0.002064186381176114, 0.00031555493478663266, 0.0004444464866537601, 0.001451763091608882, 0.42236897349357605], [0.01917724311351776, 0.010487818159162998, 0.0038471068255603313, 0.0035521609243005514, 0.004626747220754623, 0.049995020031929016, 0.010201611556112766, 0.45090365409851074, 0.025451844558119774, 0.004574872553348541, 0.006401558872312307, 0.0018549918895587325, 0.0020246573258191347, 0.004124858416616917, 0.0019844488706439734, 0.0014888044679537416, 0.39930251240730286], [0.01637098379433155, 0.010938969440758228, 0.0013468789402395487, 0.005412278231233358, 0.0012291853781789541, 0.015919027850031853, 0.04087802767753601, 0.4309636652469635, 0.01244663167744875, 0.017555953934788704, 0.014009170234203339, 0.009744644165039062, 0.006223001051694155, 0.003360082395374775, 0.0022692023776471615, 0.020978758111596107, 0.39035356044769287], [0.002892599208280444, 0.0006124047213234007, 0.0008798516937531531, 0.001340598682872951, 0.00022416871797759086, 0.0011904370039701462, 0.002633914817124605, 0.5279956459999084, 0.0005072889616712928, 0.0012499174335971475, 0.0009055507835000753, 0.0005750969285145402, 0.00020429751020856202, 0.00021154091518837959, 0.0003355686494614929, 0.0015126094222068787, 0.456728458404541], [0.04075222089886665, 0.012954768724739552, 0.005223289132118225, 0.0033225025981664658, 0.0019433426205068827, 0.009072254411876202, 0.0054921857081353664, 0.21823474764823914, 
0.0463719442486763, 0.011752347461879253, 0.00147629389539361, 0.007255225442349911, 0.14189735054969788, 0.2490113228559494, 0.011619366705417633, 0.03508799523115158, 0.19853287935256958], [0.016528114676475525, 0.0036279086489230394, 0.002210112288594246, 0.0009533863631077111, 0.0008303510840050876, 0.00223417766392231, 0.004966397304087877, 0.32700714468955994, 0.00591988256201148, 0.04761603847146034, 0.0007012527785263956, 0.008117501623928547, 0.013477770611643791, 0.01972106657922268, 0.006106044631451368, 0.2420887053012848, 0.29789406061172485], [0.0019254329381510615, 0.00026645755860954523, 0.0008027945295907557, 0.0005093683139421046, 6.479684088844806e-05, 0.00023225342738442123, 0.000847706978674978, 0.5265746712684631, 8.768107363721356e-05, 0.0006163345533423126, 0.0007661805720999837, 0.0013035588199272752, 0.0005238872836343944, 0.0002731305721681565, 0.002233217703178525, 0.006698576267808676, 0.45627400279045105], [0.007177839521318674, 0.0005732851568609476, 0.0008213947876356542, 0.0004909157287329435, 0.0001423762005288154, 0.0008940909756347537, 0.002146889688447118, 0.413679838180542, 0.0013246947200968862, 0.004548837896436453, 0.0003284709819126874, 0.006007183808833361, 0.005461385939270258, 0.0025567919947206974, 0.0010900050401687622, 0.1837591826915741, 0.36899688839912415], [0.02561011165380478, 0.0021829032339155674, 0.0010717230616137385, 0.00046676420606672764, 0.0006157277966849506, 0.0027721738442778587, 0.005489006172865629, 0.47369009256362915, 0.0017233727267012, 0.007978984154760838, 0.00045849656453356147, 0.0019395765848457813, 0.0303040724247694, 0.001505997614003718, 0.0004254317900631577, 0.014789247885346413, 0.42897629737854004], [0.03696446493268013, 0.010212318040430546, 0.003192691830918193, 0.0015125449281185865, 0.0025499719195067883, 0.01926470175385475, 0.011892521753907204, 0.41882482171058655, 0.04431489482522011, 0.021493088454008102, 0.0007703466108068824, 0.0014951257035136223, 0.010517427697777748, 
0.016892556101083755, 0.004053744953125715, 0.01722819358110428, 0.37882062792778015], [0.0012698386562988162, 0.0008998833945952356, 0.0019171577878296375, 0.0005755862803198397, 0.0003075127606280148, 0.0007472692523151636, 0.00040291453478857875, 0.5199238061904907, 0.0015692142769694328, 0.004481369163841009, 0.00031185447005555034, 0.0026504655834287405, 0.0005258583114482462, 0.0004566351417452097, 0.008261977694928646, 0.00889710895717144, 0.44680145382881165], [0.017162121832370758, 0.008930070325732231, 0.002960881683975458, 0.008458403870463371, 0.0014808080159127712, 0.002465372672304511, 0.004272541031241417, 0.37467750906944275, 0.006283679511398077, 0.048825398087501526, 0.00467823026701808, 0.01272621750831604, 0.007684091571718454, 0.0016075108433142304, 0.004277735482901335, 0.15360160171985626, 0.33990785479545593], [0.0027782453689724207, 0.0005548556218855083, 0.0008010520250536501, 0.001239580218680203, 0.00020432769088074565, 0.0011087950551882386, 0.002456858055666089, 0.5290663838386536, 0.00046014960389584303, 0.0011583742452785373, 0.000824282004032284, 0.0005381780210882425, 0.00018530858505982906, 0.0001946971460711211, 0.00030399125535041094, 0.001403992297127843, 0.4567209482192993]], [[0.007425202056765556, 0.0025974437594413757, 0.0026101202238351107, 0.0008209756342694163, 0.0010302385780960321, 0.014317428693175316, 0.003922363743185997, 0.5023362636566162, 0.002805171301588416, 0.0015503281028941274, 0.0008579694549553096, 0.0011240445310249925, 0.001057591405697167, 0.0021739264484494925, 0.0010860738111659884, 0.002075693104416132, 0.45220914483070374], [0.006770264357328415, 0.005972276441752911, 0.0021420244593173265, 0.005134768318384886, 0.0009162653004750609, 0.0005742288194596767, 0.00016202159167733043, 0.5221955180168152, 0.0002853810728993267, 4.2538566049188375e-05, 3.8002737710485235e-05, 0.0004491515865083784, 0.002704157028347254, 0.00027734399191103876, 0.00019701973360497504, 5.448918091133237e-05, 
0.45208466053009033], [0.008566777221858501, 0.10941433906555176, 0.0014877432258799672, 0.0033271692227572203, 0.0007327860221266747, 0.0012509416555985808, 0.00037246147985570133, 0.47473740577697754, 0.0002347318222746253, 7.626810111105442e-05, 2.266075534862466e-05, 2.5693872885312885e-05, 0.00020165428577456623, 0.000552683777641505, 0.0003840049612335861, 0.00012086399510735646, 0.3984917402267456], [0.013074098154902458, 0.011503158137202263, 0.009244999848306179, 0.019243838265538216, 0.002696577226743102, 0.004209107253700495, 0.0013205696595832705, 0.49940025806427, 2.3900136511656456e-05, 7.994800398591906e-06, 7.149560406105593e-05, 5.2491996029857546e-05, 9.829633927438408e-05, 0.00014081670087762177, 0.00014764699153602123, 0.00020243963808752596, 0.4385622441768646], [0.026646461337804794, 0.006142487283796072, 0.03462157025933266, 0.7925723791122437, 0.0004254003579262644, 0.003416311228647828, 0.004597627092152834, 0.06949139386415482, 0.000125798731460236, 2.772988227661699e-05, 0.000123670426546596, 3.09559291054029e-05, 1.8254648239235394e-05, 0.00011723047646228224, 0.00016665719158481807, 0.0001639237452764064, 0.06131211668252945], [0.05091645196080208, 0.005027181003242731, 0.010756370611488819, 0.0665605217218399, 0.018176529556512833, 0.01172067690640688, 0.0027567048091441393, 0.4416898787021637, 0.00011143204756081104, 2.68235871772049e-05, 2.610041519801598e-05, 0.00019703178259078413, 0.00021132739493623376, 9.477348066866398e-05, 7.093177555361763e-05, 0.00011916201037820429, 0.39153802394866943], [0.011334138922393322, 0.0032153422944247723, 0.0017572115175426006, 0.0024629547260701656, 0.042147472500801086, 0.38664260506629944, 0.005464859306812286, 0.28278908133506775, 0.000676966505125165, 0.0006975207943469286, 3.374054722371511e-05, 0.0008054838399402797, 0.0001781523897079751, 0.00024365585704799742, 0.00010423064668430015, 0.001475379685871303, 0.2599712610244751], [0.008622555062174797, 0.011306416243314743, 
[... notebook output truncated: nested per-layer, per-head attention-weight matrices, i.e. softmax probabilities over the 17 input tokens, with each row summing to approximately 1 ...]
0.0015227263793349266, 0.0033990847878158092, 0.00012225699902046472, 0.000259931170148775, 7.550359441665933e-05, 0.00023588957265019417, 0.000149906292790547, 0.004306497052311897, 0.4563263952732086], [0.002330730203539133, 0.0003014556714333594, 0.0004531018785201013, 0.0012653361773118377, 0.0005290283588692546, 0.0009622600628063083, 0.0008559423731639981, 0.5113089084625244, 0.00015497059212066233, 0.0006678461795672774, 0.00030202747439034283, 0.0005115241510793567, 0.0001591813488630578, 1.72836116689723e-05, 3.5303582990309224e-05, 0.0010801417520269752, 0.4790648818016052], [0.005819212645292282, 0.00021483530872501433, 0.0006852124934084713, 0.0018345721764490008, 0.00024525431217625737, 0.000588182476349175, 0.0014814576134085655, 0.5106444954872131, 0.0009050146327354014, 0.004380492493510246, 0.0009218018967658281, 0.0018163217464461923, 0.00014580517017748207, 2.1437563191284426e-05, 7.335636473726481e-05, 0.0016012336127460003, 0.4686214327812195], [0.008563409559428692, 0.0014967328170314431, 0.0018438724800944328, 0.0038195978850126266, 0.005937446374446154, 0.008097606711089611, 0.002516165841370821, 0.4588181972503662, 0.05804136022925377, 0.020236942917108536, 0.0007753559038974345, 0.0022065294906497, 0.0021761099342256784, 0.000581152446102351, 8.617012645117939e-05, 0.0008391879964619875, 0.42396411299705505], [0.0035708188079297543, 0.0018005740130320191, 0.002772502601146698, 0.00040547605021856725, 0.0012572268024086952, 0.00506405858322978, 0.002426678780466318, 0.43854063749313354, 0.09574415534734726, 0.028675615787506104, 0.0011997005203738809, 0.0018275566399097443, 0.0010486081009730697, 0.005789272021502256, 0.00032231383374892175, 0.0015148274833336473, 0.40803998708724976], [0.0008133704541251063, 0.0004984234692528844, 0.0021919405553489923, 0.005278429947793484, 0.0003053145483136177, 0.0009671378065831959, 0.0005959445261396468, 0.515633225440979, 0.0010578557848930359, 0.002933816285803914, 0.0012055500410497189, 
0.0010139814112335443, 0.00035427865805104375, 0.0003622338699642569, 0.0022079104091972113, 0.0010145187843590975, 0.4635660946369171], [0.00954220350831747, 0.002636347198858857, 0.010769193060696125, 0.009865867905318737, 0.0008806141559034586, 0.0023686010390520096, 0.004097809083759785, 0.45221176743507385, 0.0071501350030303, 0.04089708253741264, 0.002739539137110114, 0.008865737356245518, 0.001317090936936438, 0.0012959641171619296, 0.0017854789039120078, 0.029335923492908478, 0.4142405688762665], [0.0022447144147008657, 0.0010734116658568382, 0.0011683915508911014, 0.001717484905384481, 0.00023086908913683146, 0.0007393963751383126, 0.00145628210157156, 0.5185094475746155, 0.00030199639149941504, 0.0007291302317753434, 0.00019331704243086278, 0.0003905851044692099, 0.00028759113047271967, 0.00016863590280991048, 9.724332630867139e-05, 0.0007416990702040493, 0.46994978189468384]], [[0.03129994124174118, 0.05397389084100723, 0.03612878918647766, 0.06880293041467667, 0.009010836482048035, 0.042996156960725784, 0.18930235505104065, 0.08727110177278519, 0.021433716639876366, 0.16774217784404755, 0.007310076616704464, 0.01882689632475376, 0.06909318268299103, 0.011242610402405262, 0.004475067835301161, 0.09938367456197739, 0.08170662820339203], [0.09871657937765121, 0.09463932365179062, 0.03297179937362671, 0.0182070042937994, 0.021062668412923813, 0.11692299693822861, 0.20972827076911926, 0.19322611391544342, 0.004545035772025585, 0.004832039587199688, 0.0009570408728905022, 0.0018668370321393013, 0.010231228545308113, 0.004413231275975704, 0.0008356698672287166, 0.007240993436425924, 0.17960324883460999], [0.03629371151328087, 0.02435954660177231, 0.01011139526963234, 0.011824710294604301, 0.017329847440123558, 0.05727069452404976, 0.04356948658823967, 0.4040355980396271, 0.0030012577772140503, 0.0023756285663694143, 0.0008529227925464511, 0.0005335372989065945, 0.004823393654078245, 0.001697343192063272, 0.0003192056610714644, 0.004285227041691542, 
0.37731653451919556], [0.032456330955028534, 0.19714994728565216, 0.15353699028491974, 0.023963244631886482, 0.024091064929962158, 0.05091731250286102, 0.044780388474464417, 0.22628700733184814, 0.008570521138608456, 0.0028142097871750593, 0.0018266913248226047, 0.000771894701756537, 0.003745390335097909, 0.01064717024564743, 0.0028436852153390646, 0.0037821924779564142, 0.2118159681558609], [0.021946735680103302, 0.11858170479536057, 0.08319760113954544, 0.04987845569849014, 0.005240418016910553, 0.01651344634592533, 0.010430018417537212, 0.3407377004623413, 0.0032757564913481474, 0.006637522019445896, 0.0010336334817111492, 0.006381471175700426, 0.009055445902049541, 0.0016053136205300689, 0.0001432521385140717, 0.009627648629248142, 0.3157138228416443], [0.05243082344532013, 0.2586538791656494, 0.14304545521736145, 0.07273533940315247, 0.0032944355625659227, 0.01708938181400299, 0.03147002309560776, 0.20326100289821625, 0.004605574067682028, 0.00763388816267252, 0.00042283316724933684, 0.002630773466080427, 0.006925344932824373, 0.0013102114899083972, 0.000327576941344887, 0.007790989242494106, 0.18637242913246155], [0.0497361496090889, 0.27932223677635193, 0.11239410191774368, 0.06700069457292557, 0.02463410422205925, 0.048935454338788986, 0.04947773367166519, 0.17399564385414124, 0.007480769883841276, 0.006080171559005976, 0.0004352664982434362, 0.001014033448882401, 0.008351865224540234, 0.0008619399741292, 0.0003128921380266547, 0.010354693047702312, 0.1596122831106186], [0.005096247885376215, 0.004514685366302729, 0.0037696922663599253, 0.003816161770373583, 0.0015304754488170147, 0.0049208952113986015, 0.0015945304185152054, 0.5016276836395264, 0.0028718002140522003, 0.001891767606139183, 0.000566433125641197, 0.0014097377425059676, 0.003448096802458167, 0.0022105479147285223, 0.0003155835438519716, 0.0019418180454522371, 0.45847389101982117], [0.14860279858112335, 0.00627906946465373, 0.003943925723433495, 0.004860382527112961, 0.004753129556775093, 
0.016632191836833954, 0.01723095215857029, 0.23678094148635864, 0.013921421952545643, 0.08884069323539734, 0.0037801654543727636, 0.02108956314623356, 0.1472177803516388, 0.008337331935763359, 0.0005608194624073803, 0.05095003917813301, 0.22621877491474152], [0.29361915588378906, 0.0029370232950896025, 0.0022143095266073942, 0.0014191137161105871, 0.0013048818800598383, 0.003201795509085059, 0.007958369329571724, 0.2999245524406433, 0.009542430751025677, 0.016123412176966667, 0.001946056610904634, 0.005301306024193764, 0.02014773152768612, 0.001099143992178142, 0.0003889049985446036, 0.04376094415783882, 0.28911083936691284], [0.019498111680150032, 0.004896295722573996, 0.005043943412601948, 0.002846852410584688, 0.003291438100859523, 0.0029365152586251497, 0.0026671786326915026, 0.48742565512657166, 0.011153140105307102, 0.004587016999721527, 0.0016815053531900048, 0.00145218544639647, 0.0022949164267629385, 0.0012774126371368766, 0.00011621848534559831, 0.004543577320873737, 0.444288045167923], [0.08067811280488968, 0.0038186467718333006, 0.0038021670188754797, 0.001214314135722816, 0.0018836510134860873, 0.002768467180430889, 0.006292213220149279, 0.3533766567707062, 0.07308633625507355, 0.04606712982058525, 0.003590243635699153, 0.004387885332107544, 0.03129352629184723, 0.005709726829081774, 0.0010956136975437403, 0.047655295580625534, 0.3332800567150116], [0.07768674939870834, 0.0012278907233849168, 0.0010796755086630583, 0.0004826653457712382, 0.0021101697348058224, 0.006026304326951504, 0.011747465468943119, 0.15186335146427155, 0.017967049032449722, 0.10085295140743256, 0.0016179749509319663, 0.018610544502735138, 0.009471165016293526, 0.0022461065091192722, 0.001770619535818696, 0.4545133709907532, 0.1407259702682495], [0.08089234679937363, 0.0022588029969483614, 0.002382456324994564, 0.0014732616255059838, 0.001473354990594089, 0.0026552025228738785, 0.0014451079769060016, 0.3696480393409729, 0.01832922361791134, 0.05042244866490364, 
0.0019298945553600788, 0.00955882016569376, 0.0549248605966568, 0.004644791595637798, 0.0005785772809758782, 0.04746290668845177, 0.3499198257923126], [0.01620936021208763, 0.0012749010929837823, 0.0005969212506897748, 0.0003393683291506022, 0.000526351504959166, 0.0024368988815695047, 0.0004092109447810799, 0.49986138939857483, 0.008397913537919521, 0.003379252040758729, 0.0004526883421931416, 0.0009417283581569791, 0.005914950743317604, 0.003761844476684928, 6.29035203019157e-05, 0.006380514241755009, 0.4490537643432617], [0.17847469449043274, 0.027851495891809464, 0.008031203411519527, 0.003897752845659852, 0.0019736175891011953, 0.009174268692731857, 0.026784775778651237, 0.2657906115055084, 0.04462268203496933, 0.023058690130710602, 0.004798270296305418, 0.00992236565798521, 0.11499764025211334, 0.005517386831343174, 0.004190279170870781, 0.025762612000107765, 0.24515166878700256], [0.005230667535215616, 0.004257251974195242, 0.0035367209929972887, 0.003674691077321768, 0.0014655701816082, 0.004584519658237696, 0.0015474701067432761, 0.5020777583122253, 0.0029590483754873276, 0.0019484206568449736, 0.0005718856700696051, 0.0014308959944173694, 0.0035097636282444, 0.0022255724761635065, 0.0003176132158841938, 0.0020011805463582277, 0.4586609899997711]], [[0.0036108652129769325, 0.01849963143467903, 0.049411874264478683, 0.0643068253993988, 0.010310312733054161, 0.034140218049287796, 0.4312479794025421, 0.19255970418453217, 0.003589708125218749, 0.007211575750261545, 0.003217290388420224, 0.0009236152982339263, 0.0016977592604234815, 0.0016501928912475705, 0.00022912635176908225, 0.012826277874410152, 0.1645670235157013], [0.039871782064437866, 0.03398009017109871, 0.009561181999742985, 0.0066717229783535, 0.0007619662792421877, 0.003138384548947215, 0.00862803589552641, 0.46139055490493774, 0.0011553798103705049, 0.00030594554846175015, 0.00033771456219255924, 0.0010393774136900902, 0.00043156143510714173, 0.00022089455160312355, 0.00010533772001508623, 
0.0021215372253209352, 0.4302784502506256], [0.010167981497943401, 0.029395127668976784, 0.02430281974375248, 0.003198578953742981, 0.00256782746873796, 0.0011957045644521713, 0.0019535047467797995, 0.48395082354545593, 0.00022646102297585458, 0.00010761396697489545, 0.00010128030407940969, 0.0008287965320050716, 0.0008256362634710968, 0.00014741538325324655, 4.266649193596095e-05, 0.0003154721634928137, 0.4406723082065582], [0.02658657729625702, 0.05772276967763901, 0.03140385076403618, 0.02475529909133911, 0.021908078342676163, 0.01041181106120348, 0.013469517230987549, 0.4175986051559448, 0.0023503766860812902, 0.00048488224274478853, 0.0006043605390004814, 0.002746684942394495, 0.0023646254558116198, 0.0026747470255941153, 0.0005481202388182282, 0.0013538564089685678, 0.38301584124565125], [0.05005335062742233, 0.06884029507637024, 0.03626161068677902, 0.07021316140890121, 0.012481145560741425, 0.020489061251282692, 0.02542150765657425, 0.3719520568847656, 0.0017478337977081537, 0.000706350663676858, 0.00047120748786255717, 0.0016299448907375336, 0.0017112191999331117, 0.0012146425433456898, 0.00026298227021470666, 0.002673480426892638, 0.33387020230293274], [0.1584283858537674, 0.16888119280338287, 0.11848776042461395, 0.07025402039289474, 0.005326312500983477, 0.034746088087558746, 0.03298741951584816, 0.20063412189483643, 0.011352629400789738, 0.001555172959342599, 0.0007421558257192373, 0.001641582348383963, 0.0008922413690015674, 0.002658517798408866, 0.00012214599701110274, 0.007874086499214172, 0.18341615796089172], [0.028842566534876823, 0.021116329357028008, 0.02084812894463539, 0.04805995523929596, 0.018136393278837204, 0.08485352247953415, 0.009988483972847462, 0.397138386964798, 0.000991090084426105, 0.0006919702864252031, 0.0006804627482779324, 0.0005334317102096975, 0.0004240366106387228, 0.00031080777989700437, 0.00019894960860256106, 0.0037451786920428276, 0.36344027519226074], [0.007629618979990482, 0.006824952084571123, 0.003120941109955311, 
0.0057760304771363735, 0.003165418514981866, 0.006865301635116339, 0.011941029690206051, 0.48249566555023193, 0.004503777250647545, 0.0026128552854061127, 0.001848860178142786, 0.0019908433314412832, 0.0021825155708938837, 0.0025462007615715265, 0.0009521670290268958, 0.006207573227584362, 0.4493362605571747], [0.01958552375435829, 0.0015095955459401011, 0.000818784290459007, 0.0020499888341873884, 0.0017035930650308728, 0.007508841808885336, 0.0039483364671468735, 0.49467816948890686, 0.004700459074229002, 0.0009303622064180672, 0.00021494909015018493, 0.0003858648124150932, 0.00020934366330038756, 9.801457781577483e-05, 5.083658834337257e-05, 0.0011940813856199384, 0.4604131877422333], [0.012182013131678104, 0.0003048867511097342, 0.00022707651078235358, 0.00047681646537967026, 0.0012200751807540655, 0.0035861507058143616, 0.009847451001405716, 0.47928911447525024, 0.04041699320077896, 0.005596746690571308, 0.0006885198527015746, 0.0013021272607147694, 0.000517425884027034, 0.000577111670281738, 7.489492418244481e-05, 0.001437177648767829, 0.44225552678108215], [0.004411362577229738, 0.0006322393892332911, 0.00031119436607696116, 0.0005519564147107303, 0.0011660088784992695, 0.0009699682123027742, 0.0012022752780467272, 0.49151545763015747, 0.024453667923808098, 0.0044539375230669975, 0.0010257689282298088, 0.0018594611901789904, 0.001565079903230071, 0.00041327191866002977, 0.00016501547361258417, 0.0006913818069733679, 0.46461182832717896], [0.012597310356795788, 0.0005975068197585642, 0.0002119347918778658, 0.0002794286410789937, 0.00034651061287149787, 0.0015924399485811591, 0.0036231516860425472, 0.1744053214788437, 0.606438934803009, 0.0197446309030056, 0.002813480095937848, 0.005973202642053366, 0.0025120675563812256, 0.006549640092998743, 0.0001207633686135523, 0.0010185543214902282, 0.16117523610591888], [0.08204326033592224, 0.0008170415530912578, 0.00031860521994531155, 0.002259216969832778, 0.0010435455478727818, 0.002783864736557007, 
0.01695891097187996, 0.3300830125808716, 0.1358431726694107, 0.0548134408891201, 0.03063656948506832, 0.02114885486662388, 0.017982807010412216, 0.0019634913187474012, 0.00015568539674859494, 0.006453436333686113, 0.294695109128952], [0.0223862137645483, 0.001988781150430441, 0.0009807462338358164, 0.0019236876396462321, 0.0007421050686389208, 0.0020126320887356997, 0.0041262246668338776, 0.45059120655059814, 0.011150101199746132, 0.015047219581902027, 0.008298610337078571, 0.01567929983139038, 0.03611148148775101, 0.0064739445224404335, 0.001079636043868959, 0.008603977970778942, 0.41280409693717957], [0.004495426081120968, 0.0006828714394941926, 0.000225146854063496, 0.0007350947707891464, 6.691546877846122e-05, 0.000655531301163137, 0.0013839525636285543, 0.4695415794849396, 0.00401310995221138, 0.005244715604931116, 0.002263123169541359, 0.0065982043743133545, 0.029287250712513924, 0.03328244760632515, 0.0019626300781965256, 0.009835440665483475, 0.4297265410423279], [0.07682226598262787, 0.0018054709071293473, 0.0009224780951626599, 0.002286374568939209, 0.0004908937262371182, 0.0030793934129178524, 0.002130769658833742, 0.1607736051082611, 0.020767243579030037, 0.011926773004233837, 0.007212688215076923, 0.03214466571807861, 0.19900979101657867, 0.30220094323158264, 0.0023394026793539524, 0.026617255061864853, 0.14947007596492767], [0.007577006705105305, 0.00655153626576066, 0.002997698960825801, 0.005529630929231644, 0.0029950584284961224, 0.006482533644884825, 0.011762309819459915, 0.48331111669540405, 0.004457398783415556, 0.002653115428984165, 0.0018439117120578885, 0.00200221361592412, 0.0022309725172817707, 0.0025915312580764294, 0.000963730039075017, 0.006311739794909954, 0.4497385025024414]], [[0.01109678577631712, 0.0010117096826434135, 0.0004688598564825952, 0.0005161232547834516, 0.0002338179328944534, 0.0074919238686561584, 0.0026945227291435003, 0.4938630759716034, 0.0017491556936874986, 0.0006848368211649358, 0.0014946861192584038, 
0.0035486880224198103, 0.0008535421802662313, 0.0016514707822352648, 0.0002030957257375121, 0.0006212258012965322, 0.4718165397644043], [0.010621950030326843, 0.011646490544080734, 0.008539113216102123, 0.005121071822941303, 0.0014236761489883065, 0.0008478055824525654, 0.0018215227173641324, 0.4964909553527832, 6.970557296881452e-05, 0.00016002164920791984, 4.338341022958048e-05, 0.0025423970073461533, 0.00298856059089303, 0.0005104454467073083, 3.852651207125746e-05, 0.0003067178186029196, 0.45682764053344727], [0.0018188806716352701, 0.028312142938375473, 0.023681825026869774, 0.011251840740442276, 0.005648389458656311, 0.0003815161471720785, 0.002278159838169813, 0.48835256695747375, 0.00019114524184260517, 0.000489383062813431, 4.093154075235361e-06, 0.00016994534234981984, 0.0001474133023293689, 0.00031222368124872446, 0.00012619310291483998, 0.0004061810905113816, 0.43642815947532654], [0.0008866526186466217, 0.005787013564258814, 0.009233498945832253, 0.003838958917185664, 0.006553275976330042, 0.006551063619554043, 0.000157097281771712, 0.507166862487793, 0.00018700305372476578, 2.600349034764804e-05, 3.101159381913021e-05, 3.7370606150943786e-05, 6.639362982241437e-05, 0.00038180287810973823, 5.995716492179781e-05, 3.9490507333539426e-05, 0.45899659395217896], [0.0007577225915156305, 0.0016118313651531935, 0.004841862246394157, 0.2869369387626648, 0.030458858236670494, 0.04482046142220497, 0.0004212194471620023, 0.33332693576812744, 3.226616536267102e-05, 1.26361783259199e-05, 1.2770678040396888e-05, 1.6771409718785435e-05, 3.66924396075774e-05, 8.81249870872125e-05, 2.9239259674795903e-05, 6.144399230834097e-05, 0.2965342700481415], [0.0018758217338472605, 0.0007095796754583716, 0.0034280207473784685, 0.036336127668619156, 0.5324495434761047, 0.004720974247902632, 0.015195902436971664, 0.21015220880508423, 0.00012031511141685769, 0.0003778524696826935, 0.00022453632846008986, 0.0003098855377174914, 0.000224611401790753, 0.00010664076398825273, 
3.177867256454192e-05, 0.0007461908971890807, 0.1929900199174881], [0.009016456082463264, 0.0010768759530037642, 0.004214041866362095, 0.008062602952122688, 0.01741735450923443, 0.012850679457187653, 0.027584875002503395, 0.4808201193809509, 0.0003081945760641247, 0.0007799368468113244, 0.0002786203986033797, 0.0008961830753833055, 4.816884029423818e-05, 8.38043779367581e-05, 8.982215513242409e-05, 0.000470328435767442, 0.4360019564628601], [0.02319958060979843, 0.0052472855895757675, 0.002158869057893753, 0.004860537126660347, 0.0040931482799351215, 0.011984923854470253, 0.005775251891463995, 0.4623115062713623, 0.002492500701919198, 0.001445327652618289, 0.0011211367091163993, 0.0023307327646762133, 0.0046090781688690186, 0.006053324323147535, 0.00101186812389642, 0.001716527040116489, 0.4595884382724762], [0.03151025250554085, 0.00031631681486032903, 7.724425086053088e-05, 0.000996450544334948, 0.008429128676652908, 0.008165273815393448, 0.006539106369018555, 0.48310983180999756, 0.011348819360136986, 0.005151040852069855, 0.00043087880476377904, 0.0010849572718143463, 0.0016597297508269548, 0.00022927937970962375, 2.3442262317985296e-06, 0.00020865823898930103, 0.440740704536438], [0.0057838778011500835, 8.949777839006856e-05, 0.00032806905801407993, 0.0002956968382932246, 0.0011012930190190673, 0.0004011471464764327, 0.006425697356462479, 0.42378294467926025, 0.09371111541986465, 0.05845797806978226, 0.0022776436526328325, 0.010545536875724792, 0.0004095543990842998, 9.192561265081167e-05, 0.0001700959837762639, 0.0013485181843861938, 0.39477941393852234], [0.005166173446923494, 0.00017908650625031441, 8.968970360001549e-05, 0.00010595053754514083, 0.00016982822853606194, 9.766011498868465e-05, 0.0008671545656397939, 0.43203505873680115, 0.014478225260972977, 0.135527104139328, 0.008805598132312298, 0.021762290969491005, 0.001476060482673347, 0.001065905555151403, 0.0005293535068631172, 0.00021927796478848904, 0.3774256110191345], [0.0005935425288043916, 
8.395666554861236e-06, 6.406888132914901e-06, 4.251595237292349e-05, 3.404060407774523e-05, 1.288630755880149e-05, 0.00022723243455402553, 0.09350273758172989, 0.00020655365369748324, 0.0017144465819001198, 0.8142262101173401, 0.003288627602159977, 0.0010526729747653008, 1.9989274733234197e-05, 0.00012967838847544044, 4.442239151103422e-05, 0.08488964289426804], [0.0008409898728132248, 5.267115557217039e-05, 1.1425796401454136e-05, 8.089289622148499e-05, 4.126460680708988e-06, 1.2419686754583381e-05, 3.967360044043744e-06, 0.4865311086177826, 3.743031629710458e-05, 0.00025210093008354306, 0.0007001258200034499, 0.02711857110261917, 0.04710739850997925, 0.0010817504953593016, 4.8061840061564e-05, 4.525022814050317e-05, 0.4360716640949249], [0.002913174917921424, 0.0002600011648610234, 3.331080370116979e-05, 0.00019384610641282052, 6.820556154707447e-05, 6.168073014123365e-05, 0.00011027788423234597, 0.2345069795846939, 4.368437294033356e-05, 0.00042611753451637924, 0.0007670038612559438, 0.0048230490647256374, 0.49476343393325806, 0.0423365943133831, 0.0001985717681236565, 0.001543986494652927, 0.21695010364055634], [0.004628641065210104, 0.00019572870223782957, 8.594959217589349e-05, 0.00020368758123368025, 8.744284423300996e-05, 2.279017098771874e-05, 0.00012242970115039498, 0.4608336389064789, 0.0004829490208067, 0.004712790250778198, 3.475540142972022e-05, 0.008372235111892223, 0.0395500548183918, 0.04113396257162094, 0.0012152416165918112, 0.019672924652695656, 0.4186446964740753], [0.001314694993197918, 0.0005361109506338835, 0.0002691985573619604, 0.0007500582141801715, 0.0007724633323960006, 0.00017847558774519712, 0.0003388428594917059, 0.4435509741306305, 9.05971864995081e-06, 0.00013858739112038165, 0.0018351977923884988, 0.0008490073960274458, 0.0059469714760780334, 0.001785774016752839, 0.12267188727855682, 0.010035806335508823, 0.4090169370174408], [0.02240109257400036, 0.004995665512979031, 0.0020551285706460476, 0.004542137496173382, 
0.0038950126618146896, 0.011424831114709377, 0.005368831101804972, 0.46376729011535645, 0.002410619519650936, 0.001456008991226554, 0.0010412927949801087, 0.0023929551243782043, 0.004632208961993456, 0.0061010634526610374, 0.0009693729225546122, 0.001713728765025735, 0.46083277463912964]], [[0.0022750943899154663, 0.010453112423419952, 0.020606115460395813, 0.011505103670060635, 0.02004687860608101, 0.01047897431999445, 0.008018380962312222, 0.45112091302871704, 0.01091521605849266, 0.009288142435252666, 0.00922533217817545, 0.00910355243831873, 0.005777525249868631, 0.0024394341744482517, 0.0022723490837961435, 0.009101360104978085, 0.40737250447273254], [0.004924003966152668, 0.012850104831159115, 0.040057580918073654, 0.009098364040255547, 0.005592674016952515, 0.003316461341455579, 0.01663370430469513, 0.46458467841148376, 0.002391244051977992, 0.0032729865051805973, 0.002126396866515279, 0.0024289970751851797, 0.0031920478213578463, 0.001255786162801087, 0.002194190863519907, 0.0016412868862971663, 0.4244394600391388], [0.0005608937353827059, 0.005617762450128794, 0.003305960912257433, 0.002260830719023943, 0.002065944718196988, 0.0006302290712483227, 0.0028415846172720194, 0.5147683620452881, 0.0004451780114322901, 0.0003006880870088935, 0.00035635169479064643, 0.0002530182828195393, 0.00020155945094302297, 0.0007596635259687901, 0.0019033465068787336, 0.0002076802629744634, 0.463520884513855], [0.005467827431857586, 0.053151216357946396, 0.16016684472560883, 0.026265360414981842, 0.008647864684462547, 0.006163076497614384, 0.0032885149121284485, 0.38037535548210144, 0.0005913670756854117, 0.0004674695373978466, 0.00033131203963421285, 0.001321532647125423, 0.0010609409073367715, 0.0007168218144215643, 0.0005413867183960974, 0.00021686064428649843, 0.35122621059417725], [0.0024912641383707523, 0.05242779850959778, 0.0936596617102623, 0.5487411022186279, 0.009003477171063423, 0.010475761257112026, 0.009396403096616268, 0.14255447685718536, 
0.0003057606518268585, 0.0002772718435153365, 0.00016678162501193583, 0.0007525997934862971, 0.00018151775293517858, 0.00035875433241017163, 0.0005213293479755521, 0.0001885435194708407, 0.12849752604961395], [0.005962515249848366, 0.010973064228892326, 0.0978999212384224, 0.16785530745983124, 0.08133397251367569, 0.03276844695210457, 0.046465519815683365, 0.2876873314380646, 0.000597060308791697, 0.0008538936963304877, 0.0022414643317461014, 0.00045995257096365094, 0.0003708236326929182, 0.00041112012695521116, 0.001500620972365141, 0.0008427174179814756, 0.2617762088775635], [0.0028146447148174047, 0.02998090349137783, 0.026145126670598984, 0.11946006119251251, 0.18174594640731812, 0.0685737207531929, 0.006310279946774244, 0.2952812910079956, 0.000993960304185748, 0.00037089057150296867, 0.0001394861174048856, 0.0004617071244865656, 0.0002782996743917465, 9.784934081835672e-05, 0.00019737507682293653, 0.00047915009781718254, 0.26666927337646484], [0.0010333707323297858, 0.001788356457836926, 0.004041990265250206, 0.0030567541252821684, 0.002498042769730091, 0.001178353326395154, 0.0015836611855775118, 0.5104126930236816, 0.0005800735088996589, 0.0006170897977426648, 0.00066317681921646, 0.0010502913501113653, 0.0008147733751684427, 0.000383299367967993, 0.00028441069298423827, 0.0009957130532711744, 0.46901798248291016], [0.006327589508146048, 0.0012476869160309434, 0.002237314358353615, 0.0015616700984537601, 0.00508793443441391, 0.004339228384196758, 0.004981195088475943, 0.2550294101238251, 0.029436105862259865, 0.29116204380989075, 0.036931149661540985, 0.09745568782091141, 0.008074183948338032, 0.0012486024061217904, 0.014209835790097713, 0.0058762384578585625, 0.23479412496089935], [0.005215411074459553, 0.00036832771729677916, 0.0003046841884497553, 0.0009913826361298561, 0.0019874130375683308, 0.0007939427741803229, 0.0020722495391964912, 0.447187602519989, 0.04533311724662781, 0.02392612025141716, 0.021450504660606384, 0.013086003251373768, 
0.0011283765779808164, 0.00046343824942596257, 0.006980156525969505, 0.0016835200367495418, 0.42702773213386536], [0.005034809000790119, 0.000981734017841518, 0.0008152078953571618, 0.0006002298905514181, 0.0014736175071448088, 0.0021326520945876837, 0.0016566928243264556, 0.31829139590263367, 0.050808992236852646, 0.047773316502571106, 0.19840949773788452, 0.04932990297675133, 0.011232908815145493, 0.004363834857940674, 0.00946979783475399, 0.0020765310619026423, 0.2955489158630371], [0.002113976050168276, 0.00018926890334114432, 0.0004069455317221582, 0.0003542519698385149, 0.00046129024121910334, 0.0002226813230663538, 0.0012670224532485008, 0.09073788672685623, 0.01864161714911461, 0.06516823917627335, 0.6230757832527161, 0.08617802709341049, 0.011425490491092205, 0.0013136977795511484, 0.013945518061518669, 0.0006066134083084762, 0.08389173448085785], [0.003073921659961343, 0.0012041267473250628, 0.0004789376980625093, 0.0006527347140945494, 0.00018202702631242573, 0.0002915275108534843, 0.00021177082089707255, 0.21347902715206146, 0.016476454213261604, 0.10448583215475082, 0.01666930690407753, 0.2526237368583679, 0.1739194095134735, 0.014894764870405197, 0.0018043859163299203, 0.0005466766888275743, 0.19900530576705933], [0.0030206767842173576, 0.00023137447715271264, 0.0009005973115563393, 0.0007224963046610355, 0.00030423898715525866, 0.00021672810544259846, 0.0021856441162526608, 0.28382641077041626, 0.0005988482153043151, 0.029751384630799294, 0.035690177232027054, 0.05504085123538971, 0.11812994629144669, 0.012597351334989071, 0.1744566112756729, 0.01699228398501873, 0.26533442735671997], [0.0024657759349793196, 0.0001970427401829511, 0.00016748807684052736, 0.00018534802075009793, 0.00014009508595336229, 0.0005248280358500779, 0.0007400502800010145, 0.24483902752399445, 0.001408064621500671, 0.009656690992414951, 0.02565629780292511, 0.029809560626745224, 0.02048562839627266, 0.06809348613023758, 0.12197684496641159, 0.23768503963947296, 
0.0001406587107339874, 1.9035733203054406e-05, 0.0002993363596033305, 0.00430124718695879, 0.0007239219848997891, 1.1524194633238949e-05, 0.0022125791292637587, 0.35381487011909485], [0.004572773352265358, 0.6229727864265442, 0.022383665665984154, 0.002413122681900859, 0.000362670689355582, 0.002742021344602108, 0.0059003704227507114, 0.18106311559677124, 0.00011430172889959067, 0.00015165729564614594, 2.6816562694875756e-06, 2.7619146294455277e-06, 9.666436380939558e-05, 0.0007024533115327358, 3.468029899522662e-05, 0.0008733842987567186, 0.15561091899871826], [0.004115269053727388, 0.023489195853471756, 0.6333683133125305, 0.03843390569090843, 0.011588061228394508, 0.004509551916271448, 0.0018771549221128225, 0.14559462666511536, 0.0001255011884495616, 0.000122419762192294, 5.126211362949107e-06, 3.501359242363833e-05, 3.606339305406436e-05, 0.00044822378549724817, 1.091876401915215e-05, 0.0022400752641260624, 0.13400059938430786], [0.0019748113118112087, 0.0008361928630620241, 0.013866727240383625, 0.9531517028808594, 0.0035323600750416517, 0.003416527761146426, 0.0007410639664158225, 0.011606544256210327, 6.039542768121464e-06, 9.631342254579067e-05, 2.0979675241505902e-07, 9.657464397605509e-05, 2.539563865866512e-06, 7.30191186448792e-06, 2.7621141271083616e-05, 0.0004698503471445292, 0.01016773097217083], [0.015710238367319107, 0.004889908246695995, 0.004747701808810234, 0.023744938895106316, 0.551334023475647, 0.02658320777118206, 0.009716896340250969, 0.18731112778186798, 0.00018207498942501843, 0.0010387522634118795, 4.93815605295822e-06, 1.2415423952916171e-05, 0.00013230141485109925, 0.000416814349591732, 5.4612778512819204e-06, 0.005046966951340437, 0.16912229359149933], [0.0034168993588536978, 0.0022271759808063507, 0.0033042575232684612, 0.004180824849754572, 0.018737608566880226, 0.49540263414382935, 0.014411557465791702, 0.2417408674955368, 0.0014099132968112826, 0.0009504570043645799, 1.805807914934121e-05, 1.002754106593784e-05, 
3.997509338660166e-05, 9.342974954051897e-05, 2.634658812894486e-05, 0.00112089142203331, 0.2129090428352356], [0.0021914467215538025, 0.00832028966397047, 0.004788258112967014, 0.005528679117560387, 0.003466195659711957, 0.013044467195868492, 0.008945983834564686, 0.4852476418018341, 0.0014283099444583058, 0.003144088201224804, 0.0022380719892680645, 0.0008132868679240346, 0.0008117115939967334, 0.0017450281884521246, 0.001616528956219554, 0.0018050218932330608, 0.454865038394928], [0.4379085600376129, 0.00028174620820209384, 3.1670977477915585e-05, 0.00015586770314257592, 0.0027239841874688864, 0.0009933231631293893, 0.17001473903656006, 0.16477473080158234, 0.004492960404604673, 0.08151695877313614, 0.00017584662418812513, 0.0016925687668845057, 0.0005805432447232306, 1.2447393601178192e-05, 1.0126451570613426e-06, 0.0015085089253261685, 0.13313452899456024], [0.003525580745190382, 4.802578405360691e-05, 2.01742604986066e-05, 9.991535989684053e-06, 4.661239927372662e-06, 5.132077421876602e-05, 0.0005717097665183246, 0.022776108235120773, 0.8859135508537292, 0.06343701481819153, 0.000866693735588342, 0.0017111633205786347, 0.00015655916649848223, 0.000185528420843184, 2.203381882281974e-05, 4.2796200432348996e-05, 0.020657191053032875], [0.011056514456868172, 3.670415026135743e-05, 3.75458002963569e-05, 5.2443712775129825e-05, 3.189638664480299e-05, 2.9558484584413236e-06, 0.005105303134769201, 0.009075704962015152, 0.003393452614545822, 0.9445227384567261, 0.0015669281128793955, 0.01678871177136898, 0.0006078589358367026, 3.815459422185086e-05, 1.7540629414725117e-05, 6.0220550949452445e-05, 0.007605218794196844], [0.001715721096843481, 4.4702055674861185e-06, 3.7012682696513366e-06, 3.2903128612815635e-06, 3.4372243362668087e-07, 6.439051389861561e-07, 0.0006992665003053844, 0.00806423556059599, 0.0006165258237160742, 0.03605213388800621, 0.9346634149551392, 0.006596107501536608, 0.003923584707081318, 9.183640941046178e-05, 9.569924441166222e-05, 
0.0001224019069923088, 0.007346579805016518], [0.0001551469904370606, 5.273045644571539e-07, 3.6399751479621045e-07, 2.6008397981058806e-05, 3.606203557993126e-09, 2.4337593274026403e-08, 8.297465683426708e-06, 0.0022804904729127884, 2.89407040554579e-07, 0.0016876587178558111, 0.00042468419997021556, 0.9910705089569092, 0.0023586973547935486, 5.395537300501019e-06, 1.2956435057276394e-05, 5.2216324547771364e-05, 0.0019168899161741138], [0.0006996840238571167, 6.358242899295874e-06, 3.444561116339173e-07, 8.608779467067507e-07, 6.041139499757264e-07, 9.932793432199105e-08, 8.998684279504232e-06, 0.004120247904211283, 3.383163402759237e-07, 0.00014349669800139964, 1.1060445103794336e-05, 0.0007158363587222993, 0.9806085228919983, 0.008883124217391014, 1.6464431610074826e-05, 0.0012194132432341576, 0.003564612939953804], [0.00262492336332798, 0.0007247307221405208, 0.0001397337473463267, 2.2053094653529115e-05, 1.2582573617692105e-05, 9.890898581943475e-06, 5.660822716890834e-05, 0.053488463163375854, 0.00022304743470158428, 0.001291738823056221, 1.1688776794471778e-05, 0.0016349911456927657, 0.10247543454170227, 0.7778118848800659, 0.0005079564871266484, 0.010389856062829494, 0.048574384301900864], [0.0003329257888253778, 7.444038783432916e-05, 0.0001273355446755886, 8.453674672637135e-05, 2.3071950636222027e-05, 2.8033704438712448e-05, 0.00013234779180493206, 0.018939178436994553, 7.5294128691894e-06, 0.0002344320819247514, 0.00016444017819594592, 0.00033245462691411376, 0.011586690321564674, 0.01243089884519577, 0.9226889610290527, 0.015689915046095848, 0.01712280884385109], [0.0022594965994358063, 0.007946429774165154, 0.004695436917245388, 0.0053703333251178265, 0.003358474001288414, 0.012818355113267899, 0.008875174447894096, 0.48547399044036865, 0.0014509111642837524, 0.0032204673625528812, 0.0022641364485025406, 0.0008676875731907785, 0.000867484079208225, 0.001839510165154934, 0.0016459976322948933, 0.0019465988734737039, 0.45509955286979675]]], 
[[[0.006081664934754372, 0.05992679297924042, 0.004632278345525265, 0.04761708155274391, 0.0069939917884767056, 0.03733282908797264, 0.04673796519637108, 0.39511778950691223, 0.0018650954589247704, 0.0007704297895543277, 0.0002778128255158663, 0.0020284021738916636, 0.0011147432960569859, 0.00067733513424173, 7.294692477444187e-05, 0.0015523826004937291, 0.3872005045413971], [0.005391435232013464, 0.08358006924390793, 0.006630939897149801, 0.011355679482221603, 0.004883507266640663, 0.020148931071162224, 0.010913971811532974, 0.4339543879032135, 0.0006111536640673876, 0.00014911442121956497, 0.00017661698802839965, 0.00026286710635758936, 0.0004035455349367112, 0.0008672158000990748, 2.0717823645099998e-05, 0.0002563974994700402, 0.4203934967517853], [0.00915137305855751, 0.2797113358974457, 0.019832463935017586, 0.018241873010993004, 0.003129567950963974, 0.011380055919289589, 0.011015127412974834, 0.32218578457832336, 0.0005929506733082235, 0.00021194826695136726, 0.00023431847512256354, 0.0003328909515403211, 0.0003763137210626155, 0.0003916354908142239, 6.570235564140603e-05, 0.00043183378875255585, 0.3227148652076721], [0.013154246844351292, 0.3600543141365051, 0.023770911619067192, 0.030796343460679054, 0.016679560765624046, 0.03596251830458641, 0.030871694907546043, 0.24113529920578003, 0.0031435987912118435, 0.0007975840708240867, 0.0004333582182880491, 0.0005949624464847147, 0.0008247648947872221, 0.001094144769012928, 0.0002791965671349317, 0.001881363452412188, 0.23852622509002686], [0.013965611346065998, 0.5769703388214111, 0.05015614256262779, 0.06386855244636536, 0.011596623808145523, 0.02515888400375843, 0.035795003175735474, 0.10849734395742416, 0.001974851591512561, 0.0014403314562514424, 0.00027517142007127404, 0.0006395029486157, 0.0007528035785071552, 0.0004802368930540979, 9.630419663153589e-05, 0.0019671532791107893, 0.1063651442527771], [0.009709489531815052, 0.3894999623298645, 0.02210947871208191, 0.08367262780666351, 0.011140666902065277, 
0.02128802239894867, 0.029205838218331337, 0.2157878875732422, 0.002081731567159295, 0.0003966822405345738, 0.00011556350364116952, 0.0005021410179324448, 0.0004580656823236495, 0.00030238102772273123, 3.912653119186871e-05, 0.0010188381420448422, 0.21267148852348328], [0.011339440010488033, 0.20982499420642853, 0.008171171881258488, 0.04262728989124298, 0.02856399677693844, 0.10336685925722122, 0.018930068239569664, 0.2827814221382141, 0.0021787085570394993, 0.000823981303256005, 0.0005304586375132203, 0.0017707452643662691, 0.0014833150198683143, 0.001551308436319232, 0.0003602523938752711, 0.0035466367844492197, 0.2821493148803711], [0.007176012732088566, 0.00820113904774189, 0.0007596280192956328, 0.003672394435852766, 0.0012578731402754784, 0.004433946218341589, 0.006090362556278706, 0.4700566530227661, 0.0012794070644304156, 0.0010410482063889503, 0.0005839013610966504, 0.0007950154831632972, 0.001299087656661868, 0.0008044294081628323, 0.0001501230290159583, 0.0016047084936872125, 0.49079430103302], [0.15465664863586426, 0.0031453038100153208, 0.0009835183154791594, 0.0016759778372943401, 0.0009102323092520237, 0.002278411528095603, 0.017927585169672966, 0.378543496131897, 0.015190532431006432, 0.004076844546943903, 0.0012658251216635108, 0.002001154702156782, 0.0015545767964795232, 0.0005735427839681506, 0.00013070402201265097, 0.0026634216774255037, 0.4124222695827484], [0.18213911354541779, 0.002082312945276499, 0.000587976595852524, 0.0014239961747080088, 0.0011733978753909469, 0.002871074015274644, 0.04821096360683441, 0.3232286870479584, 0.07176226377487183, 0.00858426932245493, 0.0020331665873527527, 0.0024506866466253996, 0.002922122133895755, 0.0017343986546620727, 0.00020298622257541865, 0.0033374675549566746, 0.34525516629219055], [0.11981701850891113, 0.0031062138732522726, 0.0012621426722034812, 0.0014903683913871646, 0.0013740435242652893, 0.002417524578049779, 0.01943671703338623, 0.31205058097839355, 0.15796837210655212, 0.01622297614812851, 
0.0037208327557891607, 0.0038314065895974636, 0.009846174158155918, 0.0032112649641931057, 0.00043363115401007235, 0.0023809729609638453, 0.34142979979515076], [0.0985623449087143, 0.001958950189873576, 0.0004693427763413638, 0.0006763520068489015, 0.0006983986240811646, 0.0027868757024407387, 0.0534990057349205, 0.3235074579715729, 0.12810227274894714, 0.027528075501322746, 0.004432633053511381, 0.0032425832469016314, 0.0036657529417425394, 0.004851105622947216, 0.0001295759720960632, 0.003144132671877742, 0.3427451550960541], [0.11859655380249023, 0.010592753067612648, 0.003915958106517792, 0.006048004142940044, 0.0012478609569370747, 0.0025914048310369253, 0.03795735165476799, 0.21124321222305298, 0.20915192365646362, 0.0636608675122261, 0.014383974485099316, 0.045033980160951614, 0.031910691410303116, 0.006803985219448805, 0.00033904644078575075, 0.008478373289108276, 0.2280440628528595], [0.13088934123516083, 0.0035796703305095434, 0.0014508141903206706, 0.0024668967816978693, 0.0003476907149888575, 0.0005831770249642432, 0.010213586501777172, 0.17767012119293213, 0.06018450856208801, 0.03426986187696457, 0.016314895823597908, 0.10584431141614914, 0.2015276551246643, 0.029337994754314423, 0.002748935017734766, 0.028454555198550224, 0.1941160261631012], [0.09070423245429993, 0.0025058856699615717, 0.001340070040896535, 0.0009188575786538422, 0.0005339680355973542, 0.0017334287986159325, 0.006997799966484308, 0.07420050352811813, 0.04359907656908035, 0.026590466499328613, 0.020357169210910797, 0.05759013816714287, 0.3899690806865692, 0.17142851650714874, 0.003307122504338622, 0.028572555631399155, 0.07965105026960373], [0.05354088172316551, 0.002591901458799839, 0.0007613704074174166, 0.0023510607425123453, 0.0004944084794260561, 0.0008302785572595894, 0.017565065994858742, 0.06460551917552948, 0.03785065561532974, 0.03227800875902176, 0.010800259187817574, 0.042973730713129044, 0.5491861701011658, 0.08713904768228531, 0.0025061580818146467, 
0.027050837874412537, 0.06747457385063171], [0.007503681816160679, 0.008212205022573471, 0.0007863524951972067, 0.003759582992643118, 0.0013562600361183286, 0.004557321779429913, 0.006450110115110874, 0.4691476821899414, 0.001448334543965757, 0.001180148683488369, 0.0006850242498330772, 0.0008904925780370831, 0.001459707971662283, 0.0009326364961452782, 0.00017835278413258493, 0.0018082803580909967, 0.489643931388855]], [[0.016195744276046753, 0.007531195878982544, 0.00832654070109129, 0.01957850717008114, 0.003244524821639061, 0.0041532255709171295, 0.017489759251475334, 0.14534495770931244, 0.18893791735172272, 0.20019234716892242, 0.01511116698384285, 0.048345886170864105, 0.05928065627813339, 0.03101625293493271, 0.0026634729001671076, 0.06964404881000519, 0.16294381022453308], [0.005956089124083519, 0.04857485741376877, 0.03526328131556511, 0.536970317363739, 0.09033060818910599, 0.09486401826143265, 0.036154985427856445, 0.06904285401105881, 0.0009082306060008705, 0.0010115077020600438, 0.0016392747638747096, 0.0006081801257096231, 0.003833032911643386, 0.0006445400649681687, 0.00021222887153271586, 0.0010150948073714972, 0.07297086715698242], [0.009720677509903908, 0.028409060090780258, 0.006889475509524345, 0.1505645215511322, 0.07185011357069016, 0.09998802840709686, 0.014820966869592667, 0.29232335090637207, 0.001485234941355884, 0.0008462928817607462, 0.001615152694284916, 0.0005526298773474991, 0.0035069601144641638, 0.0007641459815204144, 0.00010834964632522315, 0.0008453446207568049, 0.3157096803188324], [0.011588013730943203, 0.03263028338551521, 0.016619393602013588, 0.07424406707286835, 0.12225540727376938, 0.10742254555225372, 0.06151161342859268, 0.26838418841362, 0.004352821968495846, 0.00218875496648252, 0.002210963750258088, 0.0013849706156179309, 0.003246636362746358, 0.0015423427103087306, 0.00036032666685059667, 0.002137352479621768, 0.2879202663898468], [0.006749264895915985, 0.0028159739449620247, 0.004609315190464258, 0.0210895836353302, 
0.014897584915161133, 0.03405199572443962, 0.19525428116321564, 0.3322852551937103, 0.0028566333930939436, 0.008909238502383232, 0.001465243985876441, 0.0006865789764560759, 0.0013370326487347484, 0.0007473042351193726, 0.00012875357060693204, 0.0011581083526834846, 0.3709578216075897], [0.005366782192140818, 0.003631867468357086, 0.0029813863802701235, 0.005355299450457096, 0.003701434237882495, 0.009891425259411335, 0.1378726363182068, 0.3925182521343231, 0.001643494819290936, 0.0028973990119993687, 0.000407673156587407, 0.00045835901983082294, 0.001067211152985692, 0.00043942814227193594, 3.0581992177758366e-05, 0.0007625091238878667, 0.4309742748737335], [0.026992961764335632, 0.01583678089082241, 0.001769352937117219, 0.007409827783703804, 0.009119429625570774, 0.011499504558742046, 0.01268517691642046, 0.4155344069004059, 0.012704935856163502, 0.012436505407094955, 0.0009204059024341404, 0.005355632398277521, 0.020831512287259102, 0.003900873241946101, 9.179565677186474e-05, 0.0021385583095252514, 0.4407724142074585], [0.004720430355519056, 0.002525599440559745, 0.0008383329259231687, 0.00580767123028636, 0.0027247031684964895, 0.0035044869873672724, 0.005578236188739538, 0.4559880495071411, 0.0011049964232370257, 0.0018586115911602974, 0.0009332125773653388, 0.00134023348800838, 0.002363360719755292, 0.0009372793138027191, 0.00029140099650248885, 0.002324732718989253, 0.5071585774421692], [0.0032737369183450937, 0.0008022664114832878, 0.000240363267948851, 0.00716595072299242, 0.0014526378363370895, 0.004759353585541248, 0.01834230124950409, 0.23834951221942902, 0.010684075765311718, 0.14173462986946106, 0.06206003949046135, 0.08581943064928055, 0.13188433647155762, 0.0024691049475222826, 0.0004012853023596108, 0.011505370028316975, 0.2790555953979492], [0.005403564777225256, 0.0007430469268001616, 0.00015542798792012036, 0.0023276153951883316, 0.0010992380557581782, 0.0019804127514362335, 0.005390452686697245, 0.372401624917984, 0.008150969631969929, 
0.013963522389531136, 0.017318231984972954, 0.03829382359981537, 0.08700034767389297, 0.008544448763132095, 0.0004107538843527436, 0.0066061303950846195, 0.4302103519439697], [0.004853794816881418, 0.001001037540845573, 3.6372177419252694e-05, 0.0006635976606048644, 0.0006123472121544182, 0.0003230271686334163, 0.0006626614485867321, 0.29512521624565125, 0.007403214927762747, 0.009180111810564995, 0.002477343427017331, 0.06222626566886902, 0.23668812215328217, 0.03415898233652115, 0.0003038712020497769, 0.009002981707453728, 0.3352811336517334], [0.003494726028293371, 0.000769023026805371, 0.00012818830145988613, 0.0005306644015945494, 0.0005303329671733081, 0.0006576215964742005, 0.0015999723691493273, 0.40922337770462036, 0.001091606798581779, 0.0008842453826218843, 0.0030796322971582413, 0.0032097497023642063, 0.06694649904966354, 0.020804084837436676, 0.0011883035767823458, 0.010251147672533989, 0.47561079263687134], [0.000959699391387403, 0.0002990888897329569, 5.9609210438793525e-05, 0.00010200739779975265, 0.00012336000509094447, 0.00018489906506147236, 0.0013471123529598117, 0.4471541941165924, 0.00014906033175066113, 0.00012513915135059506, 0.00027241845964454114, 0.0001663800358073786, 0.007664814125746489, 0.01962592452764511, 0.000740576593670994, 0.013673599809408188, 0.5073521137237549], [0.0009458474814891815, 0.00032672841916792095, 9.562892228132114e-05, 0.00027293103630654514, 0.00012847421749029309, 0.00041943677933886647, 0.0020799310877919197, 0.4430186152458191, 3.9779115468263626e-05, 0.00017419367213733494, 0.00013837730512022972, 0.00023241918825078756, 0.004290263168513775, 0.003138928208500147, 0.0006468164501711726, 0.018844034522771835, 0.5252075791358948], [0.0006187675753608346, 0.00028512885910458863, 2.947577740997076e-05, 0.00015786872245371342, 0.00014967193419579417, 0.0004934691824018955, 0.0002329775452381, 0.4530373513698578, 7.580123201478273e-05, 0.00014958517567720264, 0.00014327304961625487, 0.00036756545887328684, 
0.0020051472820341587, 0.001575519680045545, 0.0004626726149581373, 0.005856851581484079, 0.53435879945755], [0.006384200882166624, 0.001496648881584406, 0.00011051010369556025, 0.00247593829408288, 0.001244617160409689, 0.0016671734629198909, 0.0019840325694531202, 0.4408290684223175, 0.001353834057226777, 0.0007860823534429073, 0.0004793203843291849, 0.0007536641205660999, 0.013821986503899097, 0.01104042399674654, 0.00039385183481499553, 0.010251244530081749, 0.504927396774292], [0.004829013254493475, 0.0025731620844453573, 0.0008636197890155017, 0.00586956599727273, 0.002794514410197735, 0.0036605612840503454, 0.005821586586534977, 0.45518770813941956, 0.0012199264019727707, 0.002042418345808983, 0.0010405421489849687, 0.0014593041269108653, 0.00251271715387702, 0.0010186491999775171, 0.00032868131529539824, 0.002559759421274066, 0.5062181949615479]], [[0.020592810586094856, 0.031068485230207443, 0.0030091910157352686, 0.0019116721814498305, 0.0016300983261317015, 0.00534924678504467, 0.0010003162315115333, 0.2047761231660843, 0.4428405463695526, 0.013930168002843857, 0.004908159375190735, 0.010767710395157337, 0.0014716139994561672, 0.0041494728066027164, 0.0013201857218518853, 0.0018012389773502946, 0.24947291612625122], [0.008310715667903423, 0.033739812672138214, 0.2864121198654175, 0.32581403851509094, 0.021804066374897957, 0.005936089437454939, 0.006554628722369671, 0.14570818841457367, 0.0005274848081171513, 0.0024290799628943205, 0.00023074595083016902, 0.000169322345755063, 0.0032311948016285896, 0.0005978214903734624, 0.0008715527364984155, 0.0007968372665345669, 0.15686629712581635], [0.025114938616752625, 0.03411397710442543, 0.028034377843141556, 0.26350197196006775, 0.08961477130651474, 0.03336198255419731, 0.014267167076468468, 0.23327352106571198, 0.000878416292835027, 0.006170214619487524, 0.000500328082125634, 0.0007782478351145983, 0.004409650340676308, 0.0003126771771349013, 0.00012549350503832102, 0.0018747287103906274, 
0.26366758346557617], [0.02972385659813881, 0.028901759535074234, 0.011909120716154575, 0.03582359105348587, 0.22567670047283173, 0.15854208171367645, 0.016133824363350868, 0.23219706118106842, 0.0031607914716005325, 0.0010064819362014532, 0.0007177618099376559, 0.0009901736630126834, 0.0010422583436593413, 0.000474760978249833, 0.0003878779534716159, 0.0016598624642938375, 0.2516520619392395], [0.01880212500691414, 0.003504497464746237, 0.0009997341549023986, 0.004814295098185539, 0.019866814836859703, 0.2643168568611145, 0.01715674065053463, 0.31928977370262146, 0.001218293677084148, 0.0004931697039864957, 0.00021046191977802664, 6.0413265600800514e-05, 8.493886707583442e-05, 0.0001404096110491082, 4.5598018914461136e-05, 0.00015505151532124728, 0.3488408625125885], [0.013494123704731464, 0.0017110737971961498, 0.005545753985643387, 0.013046718202531338, 0.014396010898053646, 0.011247918009757996, 0.41226208209991455, 0.24960345029830933, 0.0011181272566318512, 0.0035277006682008505, 0.00024853527429513633, 6.489689985755831e-05, 0.00023856516054365784, 9.435818355996162e-05, 9.150271216640249e-05, 0.00016662190319038928, 0.273142546415329], [0.023378994315862656, 0.024302160367369652, 0.001910935970954597, 0.008892455138266087, 0.003502874867990613, 0.009325496852397919, 0.003965886775404215, 0.39976391196250916, 0.023312702775001526, 0.0037509610410779715, 0.0016956980107352138, 0.0024269605055451393, 0.000281721557257697, 0.0001702914305496961, 8.011345926206559e-05, 0.0009240841027349234, 0.49231475591659546], [0.011980770155787468, 0.007396169472485781, 0.0037237638607621193, 0.008932376280426979, 0.005901596043258905, 0.00927004124969244, 0.008305061608552933, 0.43802934885025024, 0.0033466797322034836, 0.005797204095870256, 0.002863217843696475, 0.002911756746470928, 0.0023249078076332808, 0.00230137025937438, 0.0016775100957602262, 0.005442157853394747, 0.4797961413860321], [0.00800317246466875, 0.0005134400562383235, 0.00038273941027000546, 
0.0046585937961936, 0.0003526542568579316, 0.0004815298307221383, 0.0014348424738273025, 0.23554135859012604, 0.011392051354050636, 0.4252050220966339, 0.01297029573470354, 0.01035100407898426, 0.002546168165281415, 0.00018028220802079886, 5.717705425922759e-05, 0.00043050097883678973, 0.28549924492836], [0.003621145850047469, 0.0004858339380007237, 0.00014547632599715143, 0.0017724293284118176, 0.00038654441596008837, 0.00041515869088470936, 0.0007591542671434581, 0.15908434987068176, 0.03863447159528732, 0.17868393659591675, 0.11524900048971176, 0.26531630754470825, 0.03404228016734123, 0.0022856583818793297, 0.0002307220274815336, 0.005717561114579439, 0.1931699514389038], [0.004400501027703285, 0.0002689460525289178, 0.00010673885117284954, 0.0004513103631325066, 0.00010925299284281209, 0.000372429087292403, 0.00023232129751704633, 0.12209988385438919, 0.0010749115608632565, 0.008683067746460438, 0.029099229723215103, 0.4268147647380829, 0.25060057640075684, 0.007010831031948328, 0.0001690016215434298, 0.004305271431803703, 0.1442009061574936], [0.0064212665893137455, 0.0009032529196701944, 3.7182209780439734e-05, 0.0011846325360238552, 0.00016604784468654543, 0.00015707682177890092, 0.00020069214224349707, 0.1174682080745697, 0.007923083379864693, 0.001099211978726089, 0.00972581934183836, 0.034272450953722, 0.5990175604820251, 0.07887446135282516, 0.0006086943903937936, 0.0013662977144122124, 0.14057400822639465], [0.005209831520915031, 0.0014653399121016264, 8.564235758967698e-05, 9.721316018840298e-05, 1.1006238310073968e-05, 0.00047946471022441983, 6.052808021195233e-05, 0.33735474944114685, 0.00010327780910301954, 7.165308488765731e-05, 0.00023468099243473262, 0.0025291775818914175, 0.0370057076215744, 0.21125300228595734, 0.0025826571509242058, 0.010629099793732166, 0.3908270299434662], [0.0008229176746681333, 0.00010462543286848813, 0.00011519669351400807, 0.0004263767914380878, 2.6006326152128167e-05, 8.14265149529092e-05, 0.0006299851229414344, 
0.3966752886772156, 1.6248745851044077e-06, 0.00024319578369613737, 2.243592098238878e-05, 0.00030899079865776, 0.0006297664949670434, 0.009646696969866753, 0.001934357569552958, 0.1147339716553688, 0.47359710931777954], [0.004730875138193369, 0.0005377806373871863, 0.00015374516078736633, 0.001086399657651782, 0.0005122027359902859, 0.0008682155748829246, 0.000714510097168386, 0.34216055274009705, 0.00011033022019546479, 0.0011319939512759447, 0.0013709497870877385, 0.009159684181213379, 0.017145680263638496, 0.010857585817575455, 0.0013117485214024782, 0.21709582209587097, 0.39105188846588135], [0.010524287819862366, 0.006296233274042606, 0.000507623772136867, 0.006553971208631992, 0.0015426735626533628, 0.0038521536625921726, 0.0021818974055349827, 0.42262303829193115, 0.003211166011169553, 0.00013440323527902365, 0.00047631346387788653, 0.0014118240214884281, 0.012307706288993359, 0.02146642841398716, 0.0011419247603043914, 0.020310083404183388, 0.4854583144187927], [0.012948949821293354, 0.007929883897304535, 0.004084647633135319, 0.009830895811319351, 0.006657589226961136, 0.010164221748709679, 0.010127272456884384, 0.4344370663166046, 0.003944419790059328, 0.0068285525776445866, 0.003559018252417445, 0.003391153644770384, 0.0026713202241808176, 0.002616946818307042, 0.002035971265286207, 0.00628670072183013, 0.47248542308807373]], [[0.01765236258506775, 0.0011515406658872962, 0.00046742433914914727, 0.003537696087732911, 0.00027801250689662993, 0.0014234023401513696, 0.0016422171611338854, 0.42092835903167725, 0.01074168086051941, 0.008800662122666836, 0.0008520812261849642, 0.0025731041096150875, 0.0008202915196307003, 0.004787192214280367, 0.0005216507706791162, 0.002012457000091672, 0.521809995174408], [0.012828490696847439, 0.015513851307332516, 0.006796387955546379, 0.019984884187579155, 0.0007274491945281625, 0.0029961070977151394, 0.0029165714513510466, 0.44823238253593445, 0.0006004389724694192, 0.0003202476946171373, 6.603512883884832e-05, 
[... raw notebook output truncated: per-head attention-weight matrices, where each row is a softmax distribution of 17 probabilities (one per input token) summing to ~1 ...]
0.0022293049842119217, 0.002251217607408762, 0.0008152805967256427, 9.909580694511533e-05, 0.002008657669648528, 0.2597349286079407], [0.030890826135873795, 0.0048774913884699345, 0.0022368342615664005, 0.0021380609832704067, 0.004099604208022356, 0.016608424484729767, 0.02315337583422661, 0.22653543949127197, 0.22455313801765442, 0.193069189786911, 0.009306511841714382, 0.0021771180909126997, 0.004968162160366774, 0.0034003539476543665, 0.0001488552225055173, 0.00161849707365036, 0.25021809339523315], [0.016647247597575188, 0.0011671415995806456, 0.0012498158030211926, 0.004579117987304926, 0.004774804692715406, 0.011453363113105297, 0.017292439937591553, 0.14293035864830017, 0.0943334773182869, 0.3871225118637085, 0.12513461709022522, 0.015245389193296432, 0.0074633886106312275, 0.005015241447836161, 0.001106962445192039, 0.0036120624281466007, 0.16087201237678528], [0.021398290991783142, 0.0023197627160698175, 0.0004182607226539403, 0.0020134795922785997, 0.0001864724763436243, 0.0018597646849229932, 0.010608920827507973, 0.42670655250549316, 0.009306972846388817, 0.013215397484600544, 0.003056164598092437, 0.010228910483419895, 0.004213388543576002, 0.0009899141732603312, 0.0001486779801780358, 0.0017029246082529426, 0.49162614345550537], [0.020024023950099945, 0.0005738435429520905, 0.0006840116693638265, 0.003592725610360503, 0.0009128357050940394, 0.0018631581915542483, 0.00553504191339016, 0.2339477241039276, 0.005209977738559246, 0.011542570777237415, 0.008849975652992725, 0.05570434778928757, 0.28781193494796753, 0.08509176969528198, 0.0027863369323313236, 0.005720905493944883, 0.27014878392219543], [0.026004817336797714, 0.0013846780639141798, 0.0009464538306929171, 0.004057134967297316, 0.0025667804293334484, 0.0030928037595003843, 0.003819472389295697, 0.16580355167388916, 0.004551692865788937, 0.029466545209288597, 0.012271486222743988, 0.02901923656463623, 0.29240652918815613, 0.2157498151063919, 0.002414435613900423, 0.021567465737462044, 
0.18487711250782013], [0.007009573746472597, 0.0006911451346240938, 0.0005664720665663481, 0.0010569181758910418, 0.0031400129664689302, 0.002296663820743561, 0.004644713830202818, 0.022301090881228447, 0.0023411069996654987, 0.052116237580776215, 0.004019484389573336, 0.018984483554959297, 0.3202293813228607, 0.4991047978401184, 0.004182165954262018, 0.0328456312417984, 0.024470103904604912], [0.006987396627664566, 0.006830199621617794, 0.0018728243885561824, 0.00775423226878047, 0.0029497963842004538, 0.009837541729211807, 0.013146414421498775, 0.4543441832065582, 0.0032984348945319653, 0.002733840374276042, 0.0020361572969704866, 0.0018087761709466577, 0.0022521631326526403, 0.0024986821226775646, 0.0011204505572095513, 0.00325174443423748, 0.4772772490978241]], [[0.004985570441931486, 0.0070844898000359535, 0.010517451912164688, 0.00269911321811378, 0.011646711267530918, 0.0020164859015494585, 0.00781127717345953, 0.12247852236032486, 0.12794484198093414, 0.25989097356796265, 0.040366459637880325, 0.016538472846150398, 0.20354953408241272, 0.012263654731214046, 0.001551253953948617, 0.02000334858894348, 0.14865191280841827], [0.007212472148239613, 0.021706944331526756, 0.6324887871742249, 0.012274417094886303, 0.02395448088645935, 0.02845582738518715, 0.07491730153560638, 0.08988158404827118, 0.0006989810499362648, 0.00547898281365633, 0.0014704149216413498, 0.0008368089911527932, 0.0007665411103516817, 0.0002848693693522364, 0.0006771578919142485, 0.0007200897671282291, 0.09817446023225784], [0.004238876048475504, 0.0031583376694470644, 0.006447446066886187, 0.011673263274133205, 0.0355844683945179, 0.041932422667741776, 0.011973629705607891, 0.41724345088005066, 0.00019813873223029077, 0.0003567976818885654, 0.0019134156173095107, 0.0007581845857203007, 0.00019221074762754142, 8.615856495453045e-05, 0.0005158367566764355, 0.0006077094003558159, 0.4631195068359375], [0.0109877809882164, 0.018733065575361252, 0.02712864615023136, 0.027788721024990082, 
0.1262953281402588, 0.2742388844490051, 0.0710548460483551, 0.20928955078125, 0.0010762745514512062, 0.0010656617814674973, 0.0021682600490748882, 0.0005878574447706342, 0.0013631522888317704, 0.0007512095617130399, 0.0012044229079037905, 0.001546715502627194, 0.2247195690870285], [0.004225891549140215, 0.0010135946795344353, 0.00979903806000948, 0.010551226325333118, 0.017262037843465805, 0.22785498201847076, 0.6028282046318054, 0.042211636900901794, 0.002911378862336278, 0.027683200314641, 0.0031484225764870644, 0.001322540221735835, 0.0002842957910615951, 0.0003951598482672125, 0.0005421005771495402, 0.003169513773173094, 0.04479667916893959], [0.0032131564803421497, 0.0003200930077582598, 0.004851092584431171, 0.0033079730346798897, 0.0030305986292660236, 0.021792355924844742, 0.8670767545700073, 0.04086114838719368, 0.0003363770665600896, 0.009304952807724476, 0.0015144629869610071, 0.00019917835015803576, 4.9718284572009e-05, 6.780373223591596e-05, 0.00037671293830499053, 0.0005737515166401863, 0.04312386363744736], [0.01597990095615387, 0.007580767385661602, 0.003855861024931073, 0.04266727715730667, 0.01571275293827057, 0.02619338594377041, 0.011654693633317947, 0.3867860436439514, 0.012727024033665657, 0.006532070692628622, 0.011278621852397919, 0.016349267214536667, 0.008474690839648247, 0.0027264319360256195, 0.0009684975375421345, 0.0031377419363707304, 0.4273749589920044], [0.007080857176333666, 0.004761289805173874, 0.0032202876172959805, 0.0046277400106191635, 0.0031745564192533493, 0.007106042467057705, 0.012047868221998215, 0.4487597346305847, 0.0014414878096431494, 0.0023271956015378237, 0.0033661103807389736, 0.0017005859408527613, 0.00100427377037704, 0.000732457498088479, 0.0009026590269058943, 0.004035356920212507, 0.49371153116226196], [0.0012632374418899417, 6.81255551171489e-05, 0.0009554591961205006, 0.00016757726552896202, 0.00017009727889671922, 0.0002328252448933199, 0.008141440339386463, 0.054396990686655045, 0.010663696564733982, 
0.6404402256011963, 0.15377749502658844, 0.04531555250287056, 0.016736837103962898, 0.000921139435376972, 0.0010158420773223042, 0.0024056462571024895, 0.06332771480083466], [0.004471431020647287, 0.0001234428636962548, 0.0002918621466960758, 0.001051347702741623, 0.0005096677341498435, 0.00044376685400493443, 0.0016345218755304813, 0.09005781263113022, 0.017780892550945282, 0.07711305469274521, 0.2641834020614624, 0.3048361539840698, 0.09093035012483597, 0.014937801286578178, 0.00754490727558732, 0.01795584335923195, 0.10613381862640381], [0.008581430651247501, 0.00022472925775218755, 0.00013691693311557174, 0.0018651180434972048, 0.0006004610913805664, 0.000910055881831795, 0.001432877266779542, 0.1491088718175888, 0.0059226457960903645, 0.022053668275475502, 0.05656634271144867, 0.351685106754303, 0.182596817612648, 0.026104005053639412, 0.004961181897670031, 0.021125998347997665, 0.1661236435174942], [0.004873867146670818, 0.0001714636164251715, 0.00016864134522620589, 0.0006482871831394732, 0.00040015208651311696, 0.0002832242171280086, 0.0042347293347120285, 0.28713932633399963, 0.0005784629611298442, 0.009179624728858471, 0.03152197599411011, 0.02446536161005497, 0.11830843240022659, 0.02632470801472664, 0.06196695938706398, 0.10570980608463287, 0.3240249454975128], [0.0016174393240362406, 0.00016903350478969514, 0.00022815738338977098, 0.00023923083790577948, 0.00013738579582422972, 0.0007686250610277057, 0.0060425978153944016, 0.2819910943508148, 0.00011113385698990896, 0.0008490153704769909, 0.0013234179932624102, 0.0010876395972445607, 0.01586979441344738, 0.08942229300737381, 0.036007389426231384, 0.24640458822250366, 0.31773123145103455], [0.0007072212174534798, 2.0695446437457576e-05, 0.00034197827335447073, 0.0002634183911141008, 0.00010389957606093958, 0.00025751246721483767, 0.013238305225968361, 0.17560835182666779, 2.8504695364972576e-05, 0.003198443679139018, 0.001508731278590858, 0.0007918964838609099, 0.0024740584194660187, 
0.02437790296971798, 0.08212650567293167, 0.50252765417099, 0.19242490828037262], [0.004707667510956526, 0.0005381138762459159, 0.00024868079344742, 0.001247760490514338, 0.0002658366283867508, 0.0011880019446834922, 0.0016888439422473311, 0.3188669681549072, 0.0012510968372225761, 0.004443360026925802, 0.012069962918758392, 0.009359556250274181, 0.011358045041561127, 0.01979394257068634, 0.013770177960395813, 0.24160723388195038, 0.357594758272171], [0.00971157569438219, 0.000529648270457983, 0.0005717400345019996, 0.0032357056625187397, 0.0011286125518381596, 0.003886112244799733, 0.012129511684179306, 0.34335795044898987, 0.003968046046793461, 0.006197195965796709, 0.008548915386199951, 0.006712019443511963, 0.014836153946816921, 0.020977962762117386, 0.030314521864056587, 0.1442478597164154, 0.3896464407444], [0.007409220561385155, 0.0048757800832390785, 0.0032984695862978697, 0.004708785098046064, 0.0033497947733849287, 0.007243669591844082, 0.013466081582009792, 0.4478532373905182, 0.001528545399196446, 0.0025286555755883455, 0.003553577698767185, 0.0018581392941996455, 0.0010200438555330038, 0.0007644384750165045, 0.0010107038542628288, 0.004269884433597326, 0.4912608861923218]], [[0.017152776941657066, 0.014300890266895294, 0.001825623563490808, 0.006907407194375992, 0.0041553061455488205, 0.042142268270254135, 0.7487141489982605, 0.07999628782272339, 0.003473843913525343, 0.00023424877144861966, 0.0003855812537949532, 0.001387853641062975, 0.0017026528948917985, 0.0010885322699323297, 0.00018030806677415967, 0.0019812420941889286, 0.0743710920214653], [0.016712557524442673, 0.04698034003376961, 0.03487275913357735, 0.0027402853593230247, 0.0004018193867523223, 0.002794938860461116, 0.0018179002217948437, 0.41797035932540894, 0.0002191155799664557, 0.00023701840837020427, 0.00017069937894120812, 0.0004319166182540357, 0.0037146045360714197, 0.0005826166598126292, 8.345235983142629e-05, 0.003850969485938549, 0.46641862392425537], [0.0009322563419118524, 
0.1101909950375557, 0.007308823522180319, 0.0007050703279674053, 2.6771162083605304e-05, 0.0017128087347373366, 0.0008016882347874343, 0.41844433546066284, 0.00022863448248244822, 0.00015822470595594496, 1.2200776836834848e-05, 7.89978639659239e-06, 5.4594373068539426e-05, 6.6227228671778e-05, 0.0002040020190179348, 0.00021109527733642608, 0.45893436670303345], [0.001987552037462592, 0.06205309182405472, 0.03354446962475777, 0.007438543252646923, 0.0005691456608474255, 0.010343929752707481, 0.0005493342177942395, 0.41359367966651917, 0.00010280427522957325, 0.00020456247148104012, 3.2211992220254615e-05, 6.26455876044929e-05, 6.131632108008489e-05, 5.749647971242666e-05, 0.0001304459001403302, 0.00030673370929434896, 0.4689621329307556], [0.0017222192836925387, 0.006162389647215605, 0.04294818639755249, 0.20489472150802612, 0.0033900984562933445, 0.00456859590485692, 0.00047969515435397625, 0.34144318103790283, 1.3820853382640053e-05, 0.00010674862278392538, 5.783383676316589e-05, 0.00038419259362854064, 2.5898734747897834e-05, 5.443698682938702e-06, 9.380736446473747e-05, 0.00021982108592055738, 0.39348340034484863], [0.005642704665660858, 0.006175093352794647, 0.012648492120206356, 0.005806110333651304, 0.013703509233891964, 0.007441757246851921, 0.0032436256296932697, 0.4387225806713104, 0.0004530332225840539, 0.0004591986071318388, 0.0001350257807644084, 0.000281378161162138, 0.0006892158999107778, 0.00015151083061937243, 0.0002090871421387419, 0.0016325420001521707, 0.5026051998138428], [0.0011829162249341607, 0.005219062324613333, 0.0021068344358354807, 0.002766749821603298, 0.0009267742861993611, 0.08196073025465012, 0.010864775627851486, 0.4225058853626251, 0.0010010508121922612, 0.00027640911866910756, 6.165471859276295e-05, 6.36497134109959e-05, 1.569695450598374e-05, 7.658110553165898e-05, 0.0004226136370562017, 0.0007951443549245596, 0.4697535037994385], [0.004330813884735107, 0.004676480777561665, 0.003227201057597995, 0.004555049352347851, 
0.0008623532485216856, 0.008119049482047558, 0.021918028593063354, 0.44915395975112915, 0.0007845173240639269, 0.0015179639449343085, 0.0009737300570122898, 0.0016558027127757668, 0.0005455320933833718, 0.0005733633297495544, 0.0009591910638846457, 0.0033476967364549637, 0.4927992820739746], [0.17410631477832794, 0.0011574969394132495, 0.0007727844058535993, 0.0006417507538571954, 0.0008040676475502551, 0.0018691470613703132, 0.012762860395014286, 0.3560769855976105, 0.020435351878404617, 0.006827721372246742, 0.00026263968902640045, 0.0053595975041389465, 0.002150010084733367, 6.353038770612329e-05, 7.682108844164759e-06, 0.005925437901169062, 0.4107765257358551], [0.00043420089059509337, 0.0002330208517378196, 0.00014345829549711198, 0.00013765483163297176, 2.7091500669484958e-05, 0.0007388386875391006, 0.00108600954990834, 0.4380437731742859, 0.026568379253149033, 0.004157377406954765, 9.083386976271868e-05, 0.00024997093714773655, 4.62676071038004e-05, 3.6445515434024855e-05, 2.673239760042634e-05, 4.090273068868555e-05, 0.5279389023780823], [0.007110815495252609, 0.0006985956570133567, 0.0010073481826111674, 0.0005024212296120822, 7.472249126294628e-05, 0.0007072013686411083, 0.003982080612331629, 0.37572869658470154, 0.017646752297878265, 0.12554915249347687, 0.0056295860558748245, 0.006270220503211021, 0.0006914451951161027, 8.023084956221282e-05, 0.00022673732019029558, 0.000725676363799721, 0.45336833596229553], [0.0024653507862240076, 0.00018381248810328543, 0.00020125281298533082, 0.00021382723934948444, 2.5490513507975265e-05, 8.741093915887177e-05, 0.0001434565638191998, 0.22600077092647552, 0.000556670012883842, 0.0032318911980837584, 0.48181524872779846, 0.029450450092554092, 0.0050818780437111855, 3.523018676787615e-05, 0.0014464439591392875, 0.0003723718982655555, 0.2486884444952011], [0.0015338532393798232, 0.00013136962661519647, 0.00043260850361548364, 0.0009338534437119961, 1.1047529369534459e-05, 1.282707216887502e-05, 0.0001323282631346956, 
0.32224538922309875, 0.00021994147391524166, 0.001882028067484498, 0.027518661692738533, 0.2709803879261017, 0.004399177618324757, 8.04719966254197e-05, 0.0005215978599153459, 0.0005472408956848085, 0.3684171736240387], [0.01309477724134922, 0.0002805860713124275, 7.222242857096717e-05, 8.595949475420639e-05, 2.489298458385747e-05, 2.1147283405298367e-05, 8.959462138591334e-05, 0.3508531153202057, 7.45371071388945e-05, 0.000475141016067937, 0.00042077581747435033, 0.0323898009955883, 0.15466243028640747, 0.01553257554769516, 0.0004362289037089795, 0.014048591256141663, 0.4174376428127289], [0.0006918899598531425, 0.0021612101700156927, 5.895650610909797e-05, 1.7270594980800524e-05, 3.866154202114558e-06, 0.000324615539284423, 0.0005819381331093609, 0.4382053315639496, 0.000649857975076884, 0.000522997637744993, 1.5068594620970543e-05, 4.039863051730208e-05, 0.000432221801020205, 0.039983589202165604, 0.00021764133998658508, 0.0014718422899022698, 0.5146213173866272], [0.0001954266190296039, 0.00010724622552515939, 0.00020710354147013277, 9.43505801842548e-05, 3.404894232517108e-05, 7.662839198019356e-05, 0.0003322149277664721, 0.40801551938056946, 1.1294250725768507e-05, 0.00010893790749832988, 0.0002609151997603476, 0.0001756740821292624, 0.0005560967256315053, 0.001532661379314959, 0.10646553337574005, 0.0022924873046576977, 0.47953376173973083], [0.005221781320869923, 0.005444729700684547, 0.0036230292171239853, 0.005175991915166378, 0.0010552523890510201, 0.009777992032468319, 0.027179835364222527, 0.4439617097377777, 0.0010140828089788556, 0.001946925651282072, 0.0011733046267181635, 0.0021000003907829523, 0.0006945946952328086, 0.0007562997052446008, 0.0011931638000532985, 0.00435724388808012, 0.4853242337703705]]], [[[0.06222934275865555, 0.011223357170820236, 0.015787392854690552, 0.012799481861293316, 0.0033703488297760487, 0.01542157493531704, 0.016259174793958664, 0.24824345111846924, 0.07193581014871597, 0.05816247686743736, 0.026816723868250847, 
0.024919578805565834, 0.11732491105794907, 0.050583213567733765, 0.004960009828209877, 0.05329500511288643, 0.2066682130098343], [0.06482189893722534, 0.08041630685329437, 0.054557379335165024, 0.05996212735772133, 0.06848599016666412, 0.14059551060199738, 0.030481331050395966, 0.28718000650405884, 0.0014944530557841063, 0.0007534728501923382, 0.000969366985373199, 0.00017907416622620076, 0.0024001137353479862, 0.001198392827063799, 0.0004355222044978291, 0.0010624536080285907, 0.2050066441297531], [0.11247313022613525, 0.0365796722471714, 0.061428140848875046, 0.01429937407374382, 0.022246574983000755, 0.0935877338051796, 0.021541139110922813, 0.36245018243789673, 0.0009293583570979536, 0.0009358442039228976, 0.0006898887222632766, 0.0001616123627172783, 0.0008578920387662947, 0.0006272319587878883, 0.00036865068250335753, 0.0021685240790247917, 0.26865503191947937], [0.073676697909832, 0.029823502525687218, 0.014031712897121906, 0.0322556309401989, 0.05778970941901207, 0.061451856046915054, 0.041167087852954865, 0.38792693614959717, 0.0052524711936712265, 0.0013548419810831547, 0.0017378648044541478, 0.0011779482010751963, 0.004544850438833237, 0.003287628758698702, 0.0009731279569678009, 0.0029378158506006002, 0.2806103527545929], [0.029387326911091805, 0.005912467837333679, 0.005861077457666397, 0.022701425477862358, 0.031860511749982834, 0.1161937490105629, 0.1600247323513031, 0.3582885265350342, 0.0028381526935845613, 0.002432293025776744, 0.0005547262262552977, 0.00044128589797765017, 0.0012787713203579187, 0.0014527833554893732, 0.0008210285450331867, 0.001963406801223755, 0.25798776745796204], [0.02013743482530117, 0.0031871285755187273, 0.0007052486762404442, 0.007773532997816801, 0.013147188350558281, 0.03924290090799332, 0.0686795637011528, 0.5026764273643494, 0.0006360196857713163, 0.0002409998414805159, 0.00024169354583136737, 0.0001310967782046646, 0.0005957477842457592, 0.0004924361710436642, 0.00027813532506115735, 0.001193216652609408, 
0.3406412601470947], [0.017159676179289818, 0.0012950540985912085, 0.00046061669127084315, 0.0023834719322621822, 0.0016027853125706315, 0.004686467349529266, 0.004174637142568827, 0.5680398344993591, 0.0009863880695775151, 0.0005074794171378016, 0.0010034575825557113, 0.001329202437773347, 0.0007602209225296974, 0.00047516843187622726, 0.00022527104010805488, 0.0007380410097539425, 0.39417222142219543], [0.01735837757587433, 0.0056022778153419495, 0.002952342154458165, 0.004448907915502787, 0.0021315335761755705, 0.004583532921969891, 0.0053506093099713326, 0.543319821357727, 0.0018155629513785243, 0.0012482377933338284, 0.0015756797511130571, 0.0012242674129083753, 0.003077156376093626, 0.0025707613676786423, 0.0011548998299986124, 0.003515399293974042, 0.39807066321372986], [0.06977446377277374, 0.0014299725880846381, 0.0009855309035629034, 0.001155778532847762, 0.0011278808815404773, 0.0027726266998797655, 0.0012140030739828944, 0.2999148368835449, 0.017872991040349007, 0.0319855771958828, 0.04655005410313606, 0.03569550812244415, 0.23830150067806244, 0.012016739696264267, 0.0021897803526371717, 0.0025014900602400303, 0.2345113456249237], [0.031152071431279182, 0.00021291757002472878, 0.00024967739591374993, 0.00016816146671772003, 0.00014642412133980542, 0.00024397668312303722, 0.00010648447641870007, 0.4755999445915222, 0.003184968838468194, 0.007521115709096193, 0.019706960767507553, 0.02361619658768177, 0.07563291490077972, 0.013318437151610851, 0.0022315464448183775, 0.002504982054233551, 0.3444032371044159], [0.030725901946425438, 0.0020348012913018465, 0.0007141407113522291, 0.0002791658916976303, 0.00017581494466867298, 0.0009960209717974067, 0.0002711419074330479, 0.41817063093185425, 0.00535159045830369, 0.0022471360862255096, 0.007942823693156242, 0.012369257397949696, 0.13355253636837006, 0.051497362554073334, 0.002662493847310543, 0.016318274661898613, 0.3146909773349762], [0.03735653683543205, 0.000959041528403759, 0.0002924947766587138, 
0.0002720350166782737, 0.00015356017684098333, 0.0005411563906818628, 0.0002914085052907467, 0.508170485496521, 0.002039810409769416, 0.0006371202180162072, 0.0018173230346292257, 0.0018793451599776745, 0.02393984980881214, 0.021286070346832275, 0.0033449747134000063, 0.008148154243826866, 0.3888707160949707], [0.021539948880672455, 0.0004585519200190902, 0.0003033443936146796, 0.0004209604812785983, 0.00013121710799168795, 0.0010772220557555556, 0.0009947087382897735, 0.4661819338798523, 0.0005258549354039133, 0.0005240062018856406, 0.0007703894516453147, 0.00091246870579198, 0.03184255585074425, 0.058947086334228516, 0.01618376187980175, 0.04722842201590538, 0.35195767879486084], [0.007219757419079542, 0.00015234193415381014, 8.739755867281929e-05, 0.00019506202079355717, 6.440157449105754e-05, 0.0003273941110819578, 0.0002922629937529564, 0.5533922910690308, 8.337156032212079e-05, 0.00011111984349554405, 0.00022264687868300825, 0.0002106963365804404, 0.004670191090553999, 0.010438680648803711, 0.012619102373719215, 0.024987246841192245, 0.3849259614944458], [0.004986012354493141, 0.00023959590180311352, 0.0001758344005793333, 0.0001403661590302363, 7.464329246431589e-05, 0.0006951958639547229, 0.0001451667194487527, 0.5705699920654297, 0.0001973821927094832, 0.00010197081428486854, 0.00025859347078949213, 0.00018118292791768909, 0.0007095415494404733, 0.0053916689939796925, 0.0025105448439717293, 0.011862685903906822, 0.40175962448120117], [0.0040214103646576405, 0.00012022176815662533, 3.7768608308397233e-05, 0.00021916604600846767, 6.829619087511674e-05, 0.0003861628647428006, 0.00028214906342327595, 0.5946462750434875, 4.915626414003782e-05, 4.4148564484203234e-05, 9.050131484400481e-05, 6.464384205173701e-05, 0.00015497686399612576, 0.0008500401745550334, 0.0005385751719586551, 0.004296896513551474, 0.3941296339035034], [0.014741160906851292, 0.004172677639871836, 0.0021332723554223776, 0.0033464725129306316, 0.0015576551668345928, 0.0035026692785322666, 
0.004374745301902294, 0.5534631013870239, 0.0014334124280139804, 0.0009752177866175771, 0.0013255677185952663, 0.0010285977041348815, 0.0025830045342445374, 0.00213717482984066, 0.0009344946010969579, 0.0030172269325703382, 0.3992736339569092]], [[0.031850267201662064, 0.06144869327545166, 0.01711576245725155, 0.03911055624485016, 0.007903936319053173, 0.01682884246110916, 0.005235510412603617, 0.4188999533653259, 0.012495669536292553, 0.008952994830906391, 0.0014240797609090805, 0.003668492892757058, 0.005084467586129904, 0.007104380521923304, 0.003509915666654706, 0.005273715127259493, 0.3540927767753601], [0.08816834539175034, 0.01291849184781313, 0.007019080687314272, 0.006031675264239311, 0.0018723233370110393, 0.0027867103926837444, 0.00894177332520485, 0.47301506996154785, 0.00616964977234602, 0.000784550909884274, 0.0010844110511243343, 0.0016837569419294596, 0.0018067866330966353, 0.003910520114004612, 0.00044455082388594747, 0.0030423561111092567, 0.3803200125694275], [0.030113575980067253, 0.017297249287366867, 0.024459702894091606, 0.008308799006044865, 0.006992260925471783, 0.01253463700413704, 0.019958416000008583, 0.4835943877696991, 0.0047219800762832165, 0.00284932111389935, 0.0017693220870569348, 0.0028413215186446905, 0.002676408737897873, 0.003755107754841447, 0.0024709219578653574, 0.00704931328073144, 0.36860722303390503], [0.05742860212922096, 0.013436036184430122, 0.013409365899860859, 0.02353910356760025, 0.014928702265024185, 0.01586555317044258, 0.036650072783231735, 0.4366380572319031, 0.0065728225745260715, 0.0020143270958215, 0.002393505536019802, 0.0020754304714500904, 0.003310360014438629, 0.006202100310474634, 0.0017801353242248297, 0.0053640748374164104, 0.35839179158210754], [0.04131932556629181, 0.024496708065271378, 0.010757518000900745, 0.011858894489705563, 0.019040856510400772, 0.06169071048498154, 0.06135048717260361, 0.4226905405521393, 0.005163577385246754, 0.0016705517191439867, 0.001235193107277155, 
0.0014847967540845275, 0.0027924857567995787, 0.004041844978928566, 0.0007494086748920381, 0.0037055264692753553, 0.3259516656398773], [0.1292821764945984, 0.007671054918318987, 0.0040414659306406975, 0.0028530049603432417, 0.007765212561935186, 0.024324992671608925, 0.0555647574365139, 0.39340993762016296, 0.006063939072191715, 0.002384188584983349, 0.0009634266025386751, 0.0037653581239283085, 0.003109327983111143, 0.008813275024294853, 0.001328925834968686, 0.007802393287420273, 0.34085655212402344], [0.050938621163368225, 0.011790183372795582, 0.0151284858584404, 0.006979555822908878, 0.007527490146458149, 0.03475088253617287, 0.019052451476454735, 0.452745646238327, 0.004711616318672895, 0.006395944394171238, 0.0015513282269239426, 0.006622905842959881, 0.002581524895504117, 0.00833315309137106, 0.0025920860935002565, 0.008542521856725216, 0.35975557565689087], [0.026124773547053337, 0.021172426640987396, 0.011393862776458263, 0.013054000213742256, 0.009728864766657352, 0.022097833454608917, 0.0471414290368557, 0.4665626585483551, 0.005562187172472477, 0.0038279315922409296, 0.004973159171640873, 0.005424310453236103, 0.006342133041471243, 0.0037479421589523554, 0.00539664039388299, 0.00815630704164505, 0.33929353952407837], [0.014546267688274384, 0.019375307485461235, 0.007183321285992861, 0.008889238350093365, 0.003311531152576208, 0.010084609501063824, 0.0075137256644666195, 0.4769587814807892, 0.015988342463970184, 0.0039009368047118187, 0.001373600447550416, 0.004342284519225359, 0.007108298130333424, 0.028479604050517082, 0.003127798903733492, 0.007488923147320747, 0.38032734394073486], [0.004450300242751837, 0.013733319006860256, 0.005209342576563358, 0.0045092240907251835, 0.004290551412850618, 0.007425542920827866, 0.008546719327569008, 0.48500946164131165, 0.017422856763005257, 0.007889966480433941, 0.003429705509915948, 0.005628917831927538, 0.007145700044929981, 0.02493269182741642, 0.004979937337338924, 0.007277855183929205, 0.3881179094314575], 
0.018107185140252113, 0.005967850796878338, 0.005310032516717911, 0.006435499060899019, 0.03703780472278595, 0.009771501645445824, 0.0008294267463497818, 0.0036287070252001286, 0.10201394557952881], [0.08679255098104477, 0.07269985973834991, 0.017979495227336884, 0.028063902631402016, 0.014814398251473904, 0.04933301359415054, 0.024485468864440918, 0.3604743778705597, 0.012586617842316628, 0.0049517834559082985, 0.0036596893332898617, 0.0038979060482233763, 0.02348783053457737, 0.005996506195515394, 0.0027627393137663603, 0.0071326992474496365, 0.2808811366558075], [0.08767355978488922, 0.13635078072547913, 0.035239845514297485, 0.13724327087402344, 0.03329010680317879, 0.044328223913908005, 0.03435865789651871, 0.1449677050113678, 0.035363081842660904, 0.017686452716588974, 0.028604324907064438, 0.026636935770511627, 0.048387184739112854, 0.03152349218726158, 0.013895610347390175, 0.015499196946620941, 0.12895148992538452], [0.0657811239361763, 0.06601422280073166, 0.018645694479346275, 0.05202465504407883, 0.04328853636980057, 0.07643458247184753, 0.029198497533798218, 0.3133590519428253, 0.019482124596834183, 0.006449607666581869, 0.003896522568538785, 0.005924086552113295, 0.02998710609972477, 0.010225072503089905, 0.002904109191149473, 0.007847762666642666, 0.24853724241256714], [0.054852887988090515, 0.08701568841934204, 0.01684819906949997, 0.06211966276168823, 0.049249399453401566, 0.10840268433094025, 0.05587795376777649, 0.24187427759170532, 0.027101662009954453, 0.004100144375115633, 0.003203147789463401, 0.0032158272806555033, 0.032954346388578415, 0.02719397284090519, 0.0028123941738158464, 0.011115944012999535, 0.21206185221672058], [0.03719504550099373, 0.047675490379333496, 0.012049965560436249, 0.012270371429622173, 0.012005084194242954, 0.049066029489040375, 0.020988933742046356, 0.4266993999481201, 0.007705371826887131, 0.002430735854431987, 0.002085383515805006, 0.002285297028720379, 0.015211105346679688, 0.00838780589401722, 
0.002025302965193987, 0.006716660223901272, 0.33520203828811646], [0.007602925878018141, 0.004827121738344431, 0.0013496172614395618, 0.0022429560776799917, 0.0007215003133751452, 0.0040450310334563255, 0.005783813539892435, 0.5568481087684631, 0.0012822586577385664, 0.0004943335079587996, 0.0008652068208903074, 0.0007596623618155718, 0.0018599514150992036, 0.001120822736993432, 0.0010843497002497315, 0.0020300946198403835, 0.4070822596549988], [0.02303033135831356, 0.06632174551486969, 0.03152666613459587, 0.033328745514154434, 0.02979150041937828, 0.04299883171916008, 0.00791440811008215, 0.3318972587585449, 0.037177301943302155, 0.022176750004291534, 0.00870597641915083, 0.007801912259310484, 0.06906630843877792, 0.012161072343587875, 0.0063965716399252415, 0.019194433465600014, 0.25051015615463257], [0.007432466372847557, 0.014276672154664993, 0.004443021956831217, 0.011394022032618523, 0.006200187373906374, 0.013448784127831459, 0.0032542645931243896, 0.467684805393219, 0.024856312200427055, 0.015386702492833138, 0.004439453594386578, 0.007139013614505529, 0.05107295140624046, 0.008044007234275341, 0.004028948489576578, 0.012876519002020359, 0.34402185678482056], [0.006402334664016962, 0.014948786236345768, 0.007157870568335056, 0.010115891695022583, 0.005376024171710014, 0.008278830908238888, 0.0030313043389469385, 0.48465245962142944, 0.013261470943689346, 0.015316649340093136, 0.005895006004720926, 0.0063235219568014145, 0.027343595400452614, 0.00614633783698082, 0.007121018599718809, 0.01744796894490719, 0.36118099093437195], [0.011436976492404938, 0.037711530923843384, 0.011878268793225288, 0.01698177494108677, 0.01417006365954876, 0.023017000406980515, 0.008193421177566051, 0.4359745979309082, 0.019543413072824478, 0.019091855734586716, 0.009413733147084713, 0.0074509247206151485, 0.031808339059352875, 0.0086257578805089, 0.007620914373546839, 0.018267804756760597, 0.318813681602478], [0.0063243736512959, 0.03181307017803192, 0.007777595892548561, 
0.016617251560091972, 0.010072224773466587, 0.020546574145555496, 0.003996904473751783, 0.4953460395336151, 0.015177024528384209, 0.01090487651526928, 0.001627835794351995, 0.002320181345567107, 0.01817052811384201, 0.006770993582904339, 0.0024850498884916306, 0.010009783320128918, 0.34003961086273193], [0.018122117966413498, 0.02519756555557251, 0.012873017229139805, 0.01301340851932764, 0.011127838864922523, 0.030749518424272537, 0.012082856148481369, 0.45068156719207764, 0.00900660827755928, 0.0107443081215024, 0.0034732487984001637, 0.0028818019200116396, 0.014477398246526718, 0.010671505704522133, 0.013698055408895016, 0.027970634400844574, 0.3332284986972809], [0.007621950004249811, 0.005902812350541353, 0.003439998021349311, 0.0035460893996059895, 0.0021943405736237764, 0.00785152055323124, 0.00796153862029314, 0.48468881845474243, 0.0061890799552202225, 0.009650468826293945, 0.0035811858251690865, 0.0032489814329892397, 0.009475122205913067, 0.007896237075328827, 0.022331183776259422, 0.028303522616624832, 0.3861171007156372], [0.010218862444162369, 0.010801728814840317, 0.003073164727538824, 0.008666586130857468, 0.005444214213639498, 0.019965698942542076, 0.011238189414143562, 0.47701990604400635, 0.0071023004129529, 0.011924095451831818, 0.00234043225646019, 0.0029674337711185217, 0.009586242958903313, 0.010551417246460915, 0.01121382787823677, 0.030090976506471634, 0.367794930934906], [0.006975265685468912, 0.0037775631062686443, 0.0010782586177811027, 0.0019481683848425746, 0.0006216369802132249, 0.0035798177123069763, 0.005189040210098028, 0.5588832497596741, 0.0010735791875049472, 0.00046839407877996564, 0.0007885974482633173, 0.0006893896497786045, 0.0017185320612043142, 0.000998837174847722, 0.0010542400414124131, 0.001888980739749968, 0.40926653146743774]], [[0.004648192785680294, 0.003442854853346944, 0.0026514327619224787, 0.010619414038956165, 0.006526973098516464, 0.003910184372216463, 0.0034715752117335796, 0.49433350563049316, 
0.016302580013871193, 0.02519787661731243, 0.004452883265912533, 0.005397086497396231, 0.011241826228797436, 0.0031498554162681103, 0.0016576299676671624, 0.0027170495595782995, 0.40027916431427], [0.038913097232580185, 0.07803847640752792, 0.03787631541490555, 0.36051270365715027, 0.04141930118203163, 0.06129005551338196, 0.05830315873026848, 0.17197036743164062, 0.005536051467061043, 0.0042009176686406136, 0.0013033278519287705, 0.002835135441273451, 0.00472866278141737, 0.0029692533425986767, 0.00022088691184762865, 0.0005025758873671293, 0.1293797641992569], [0.05911184474825859, 0.08094269782304764, 0.09678555279970169, 0.32575079798698425, 0.12397237122058868, 0.0544070228934288, 0.03683457896113396, 0.09767790883779526, 0.0067354231141507626, 0.004448756575584412, 0.009519814513623714, 0.01185952965170145, 0.007989023812115192, 0.0021988616790622473, 0.0007562171667814255, 0.0014569781487807631, 0.07955274730920792], [0.04702761769294739, 0.0604376383125782, 0.04433160275220871, 0.05070260167121887, 0.04511605203151703, 0.05895973742008209, 0.11933939158916473, 0.3063926100730896, 0.010535024106502533, 0.004275477025657892, 0.0025875333230942488, 0.005445805378258228, 0.005525761749595404, 0.004163868259638548, 0.0013128508580848575, 0.0023702525068074465, 0.23147620260715485], [0.07964374125003815, 0.05127798020839691, 0.07750055193901062, 0.05841361731290817, 0.015070920810103416, 0.14958012104034424, 0.1835750937461853, 0.19679200649261475, 0.009893614798784256, 0.006574552971869707, 0.0020257041323930025, 0.004653098061680794, 0.004912586882710457, 0.002804320538416505, 0.0012980365427210927, 0.002898741513490677, 0.15308527648448944], [0.05142050236463547, 0.03977026417851448, 0.018897950649261475, 0.021504629403352737, 0.00429795915260911, 0.07824846357107162, 0.21567653119564056, 0.32269376516342163, 0.003890097141265869, 0.003306406084448099, 0.00033910846104845405, 0.0013466336531564593, 0.0014239212032407522, 0.0034608645364642143, 
0.0007846765220165253, 0.0017554879887029529, 0.23118266463279724], [0.019600534811615944, 0.02895163930952549, 0.008638061583042145, 0.0042654480785131454, 0.003703672206029296, 0.08109024912118912, 0.015439261682331562, 0.471588134765625, 0.008895229548215866, 0.002278968458995223, 0.0007404423085972667, 0.0011031500762328506, 0.001379833440296352, 0.003823250997811556, 0.0006155156879685819, 0.0016413830453529954, 0.34624528884887695], [0.01047151442617178, 0.010869299992918968, 0.005988932680338621, 0.010030053555965424, 0.006244510877877474, 0.013405241072177887, 0.014496971853077412, 0.5180286765098572, 0.004804976750165224, 0.002887410344555974, 0.00278679421171546, 0.0027594459243118763, 0.0035072937607765198, 0.003620272036641836, 0.0014990769559517503, 0.0044788033701479435, 0.38412073254585266], [0.01577749475836754, 0.002443083329126239, 0.000985043472610414, 0.004559807945042849, 0.0016035408480092883, 0.003919276874512434, 0.005553583614528179, 0.14113229513168335, 0.06641194969415665, 0.19951926171779633, 0.05228813365101814, 0.08286251872777939, 0.2813529074192047, 0.015316086821258068, 0.002431280678138137, 0.008883558213710785, 0.11496027559041977], [0.018782716244459152, 0.0026479994412511587, 0.001701736357063055, 0.004286630544811487, 0.0016593735199421644, 0.004862004891037941, 0.0032281363382935524, 0.3294934034347534, 0.10797224193811417, 0.07477851212024689, 0.03742383420467377, 0.04901175945997238, 0.06568162143230438, 0.014153708703815937, 0.004842368420213461, 0.018946781754493713, 0.2605271637439728], [0.011821421794593334, 0.003464779118075967, 0.001223136205226183, 0.002425777493044734, 0.0007846727385185659, 0.0028822661843150854, 0.004055694676935673, 0.016376618295907974, 0.11163365840911865, 0.06842425465583801, 0.05581047013401985, 0.17675118148326874, 0.4582419693470001, 0.04552078992128372, 0.002799856010824442, 0.023654261603951454, 0.01412912830710411], [0.010904503986239433, 0.0028875672724097967, 0.0010423241183161736, 
0.0019737579859793186, 0.000777874025516212, 0.0010597530053928494, 0.0014656251296401024, 0.05957600846886635, 0.18799330294132233, 0.20645125210285187, 0.07385137677192688, 0.056843988597393036, 0.276348739862442, 0.05028552561998367, 0.007932296954095364, 0.010650788433849812, 0.04995530843734741], [0.009988445788621902, 0.0023751899134367704, 0.0011509406613186002, 0.0020258519798517227, 0.0003051830572076142, 0.0022316589020192623, 0.005942929070442915, 0.33934950828552246, 0.03223421797156334, 0.08526736497879028, 0.021822165697813034, 0.06290415674448013, 0.08239644765853882, 0.03805691748857498, 0.011208119802176952, 0.041998788714408875, 0.26074209809303284], [0.006338858976960182, 0.0015343844424933195, 0.001248016837053001, 0.00037822622107341886, 0.00032369286054745317, 0.003694478888064623, 0.013205138966441154, 0.29161491990089417, 0.012927810661494732, 0.019628722220659256, 0.0072577581740915775, 0.015659401193261147, 0.06105148419737816, 0.1240500882267952, 0.055599551647901535, 0.1522640436887741, 0.23322340846061707], [0.006750902626663446, 0.0013924959348514676, 0.0010033359285444021, 9.653009328758344e-05, 0.00013429997488856316, 0.003255929332226515, 0.004341586958616972, 0.20430268347263336, 0.010519138537347317, 0.008869549259543419, 0.0068966541439294815, 0.013011535629630089, 0.03686394914984703, 0.11436480283737183, 0.05841456726193428, 0.35930654406547546, 0.17047546803951263], [0.004729514010250568, 0.0007008167449384928, 0.0005396735505200922, 0.0001490341528551653, 0.00012796091323252767, 0.003760706167668104, 0.002943586790934205, 0.42082223296165466, 0.007424837443977594, 0.006587756332010031, 0.0023053567856550217, 0.003017071168869734, 0.006946408189833164, 0.11773241311311722, 0.04483383893966675, 0.04712298512458801, 0.33025580644607544], [0.00839365553110838, 0.007631905842572451, 0.004248633980751038, 0.00776818348094821, 0.0046569244004786015, 0.01021517999470234, 0.011845833621919155, 0.534807562828064, 0.0037106736563146114, 
0.002298369538038969, 0.0023386748507618904, 0.0021257945336401463, 0.002675264608114958, 0.002868054900318384, 0.0012678381754085422, 0.0034805487375706434, 0.38966691493988037]], [[0.013903412036597729, 0.010623575188219547, 0.000635845644865185, 0.0016100846696645021, 0.002886539790779352, 0.001707747345790267, 0.0022309802006930113, 0.16394196450710297, 0.25547999143600464, 0.11303569376468658, 0.015546607784926891, 0.043146368116140366, 0.215436190366745, 0.017529167234897614, 0.002148736733943224, 0.006052414886653423, 0.1340847611427307], [0.026143644005060196, 0.13789579272270203, 0.08231703191995621, 0.03356163576245308, 0.040601372718811035, 0.007436808664351702, 0.02873421274125576, 0.19873061776161194, 0.060738205909729004, 0.12310147285461426, 0.013294029980897903, 0.012296794913709164, 0.07028704881668091, 0.00518072908744216, 0.0006492637330666184, 0.0009117064182646573, 0.1581195890903473], [0.007914688438177109, 0.008608396165072918, 0.00834632571786642, 0.025869596749544144, 0.01830594427883625, 0.0027505101170390844, 0.0028210869058966637, 0.5277080535888672, 0.0026642580050975084, 0.00494261784479022, 0.0032171569764614105, 0.002120115328580141, 0.0038374774158000946, 0.00048555684043094516, 0.00021879689302295446, 0.0001335832930635661, 0.38005581498146057], [0.007410080172121525, 0.09402276575565338, 0.33337515592575073, 0.037222158163785934, 0.028156965970993042, 0.007217737380415201, 0.00403521116822958, 0.2510848045349121, 0.005124175921082497, 0.009653392247855663, 0.014411757700145245, 0.0032002590596675873, 0.0036925855092704296, 0.0007734932005405426, 0.0009143032948486507, 0.0003807791799772531, 0.19932451844215393], [0.015755310654640198, 0.10360319912433624, 0.39242416620254517, 0.10861736536026001, 0.010206320323050022, 0.005603366531431675, 0.028675023466348648, 0.17195270955562592, 0.006664469838142395, 0.012468398548662663, 0.006742625962942839, 0.0023948028683662415, 0.002496040193364024, 0.0008527169120498002, 
0.00045099237468093634, 0.0006650317227467895, 0.1304275542497635], [0.013940802775323391, 0.01025816798210144, 0.008621418848633766, 0.009512112475931644, 0.016198186203837395, 0.015350515022873878, 0.07733681797981262, 0.46530842781066895, 0.004986444488167763, 0.010743516497313976, 0.0016347746131941676, 0.001350825885310769, 0.0056904093362390995, 0.006746944971382618, 0.001316023408435285, 0.003725755959749222, 0.34727880358695984], [0.015867462381720543, 0.0253658015280962, 0.004912342876195908, 0.03481610119342804, 0.047514911741018295, 0.06363902986049652, 0.03599691763520241, 0.41193175315856934, 0.008644253946840763, 0.012689891271293163, 0.0014663455076515675, 0.0011744051007553935, 0.01156623661518097, 0.010853887535631657, 0.0005922340205870569, 0.0019756925757974386, 0.31099265813827515], [0.0038976618088781834, 0.0074794115498661995, 0.0038347868248820305, 0.0065645803697407246, 0.003302107797935605, 0.0028914031572639942, 0.00468798540532589, 0.5540540218353271, 0.004090470261871815, 0.0027382499538362026, 0.002466007135808468, 0.0014583992306143045, 0.004061603918671608, 0.0017166860634461045, 0.00045810415758751333, 0.0008728450629860163, 0.3954256772994995], [0.014434382319450378, 0.015744900330901146, 0.005661110393702984, 0.009067351929843426, 0.003317477647215128, 0.000837066036183387, 0.004613218363374472, 0.10526461154222488, 0.2558676600456238, 0.3303821086883545, 0.02587483637034893, 0.04727783054113388, 0.09159015864133835, 0.0071126967668533325, 0.0012165674706920981, 0.001704144524410367, 0.08003375679254532], [0.018281469121575356, 0.010284986346960068, 0.006266843993216753, 0.007103536743670702, 0.008600609377026558, 0.007157580927014351, 0.008107493631541729, 0.12825043499469757, 0.29364362359046936, 0.21345259249210358, 0.03612866997718811, 0.06717440485954285, 0.06798845529556274, 0.015639716759324074, 0.004650560673326254, 0.006075994111597538, 0.10119305551052094], [0.014898733235895634, 0.013707383535802364, 0.03452673554420471, 
0.011496474035084248, 0.00272614904679358, 0.001218108693137765, 0.0013226651353761554, 0.42525601387023926, 0.019792325794696808, 0.041148070245981216, 0.02803383767604828, 0.024172931909561157, 0.025110594928264618, 0.0031056897714734077, 0.001663245726376772, 0.0016802671598270535, 0.3501408100128174], [0.010551651939749718, 0.00870425533503294, 0.0070005152374506, 0.004969916306436062, 0.0029133870266377926, 0.0014959933469071984, 0.003403209615498781, 0.11750689893960953, 0.18754586577415466, 0.3975834548473358, 0.01882622018456459, 0.030286753550171852, 0.09300632774829865, 0.013402355834841728, 0.004880982916802168, 0.0020068923477083445, 0.0959153026342392], [0.016389602795243263, 0.00616547092795372, 0.0024295656476169825, 0.009044338017702103, 0.0011576034594327211, 0.0005836543277837336, 0.003226289991289377, 0.17262287437915802, 0.18557308614253998, 0.28851574659347534, 0.043155282735824585, 0.07012853771448135, 0.06217099726200104, 0.0028795977123081684, 0.0009284336119890213, 0.0016919494373723865, 0.13333703577518463], [0.0022212008479982615, 0.0012258847709745169, 0.000569426454603672, 0.0011578048579394817, 0.0012834003427997231, 0.0013029169058427215, 0.0030831322073936462, 0.501327633857727, 0.010309464298188686, 0.010621180757880211, 0.004099587444216013, 0.0021275437902659178, 0.015102606266736984, 0.050413161516189575, 0.008771194145083427, 0.016186730936169624, 0.3701971769332886], [0.0006288525182753801, 4.4573782361112535e-05, 3.8154961657710373e-05, 0.00017193764506373554, 0.00038921847590245306, 0.0017609260976314545, 0.0007235166267491877, 0.5134615898132324, 0.0015800328692421317, 0.0015052987728267908, 0.0016500088386237621, 0.0013270501513034105, 0.0024736092891544104, 0.04744929075241089, 0.016072507947683334, 0.031157471239566803, 0.37956592440605164], [0.0008650070521980524, 0.0001002375574898906, 0.00017956364899873734, 0.0006215755711309612, 0.0016774908872321248, 0.0017772327410057187, 0.0012352804187685251, 0.4272758364677429, 
0.0034950943663716316, 0.012254757806658745, 0.0028750027995556593, 0.000918667356017977, 0.008368018083274364, 0.1513427346944809, 0.03895694017410278, 0.022276053205132484, 0.3257805407047272], [0.0030939553398638964, 0.0054433648474514484, 0.002927724039182067, 0.004843967035412788, 0.00233019283041358, 0.0021360409446060658, 0.0036734200548380613, 0.5652053952217102, 0.0033471521455794573, 0.0022795540280640125, 0.0020678197033703327, 0.001098231179639697, 0.0031936434097588062, 0.0013571378076449037, 0.0003659389913082123, 0.0006664511747658253, 0.39597001671791077]], [[0.010656710714101791, 0.0008398214704357088, 0.0010085308458656073, 0.0013813243713229895, 0.0008489462779834867, 0.0012689926661550999, 0.006463557481765747, 0.551730215549469, 0.003169531933963299, 0.00386054371483624, 0.0008925192523747683, 0.0027943372260779142, 0.0018041931325569749, 0.0007877142052166164, 0.00026937652728520334, 0.000547932751942426, 0.41167569160461426], [0.024827051907777786, 0.044606443494558334, 0.011051664128899574, 0.01900968700647354, 0.002818159991875291, 0.004833602346479893, 0.04853357374668121, 0.5010358691215515, 0.0012092306278645992, 0.0005354605964384973, 0.0001630623737582937, 0.00044838222675025463, 0.0004824246861971915, 0.0001172791889985092, 1.595236244611442e-05, 2.2071924831834622e-05, 0.34029003977775574], [0.050069279968738556, 0.21279992163181305, 0.01933296024799347, 0.036924269050359726, 0.003237913129851222, 0.01795029826462269, 0.044430509209632874, 0.3559248149394989, 0.003103818278759718, 0.0007501734071411192, 0.00027870447956956923, 0.0006802668212912977, 0.00045331745059229434, 0.00026039130170829594, 0.00013657848467119038, 0.00028547897818498313, 0.25338131189346313], [0.023023685440421104, 0.12141137570142746, 0.017255553975701332, 0.011121259070932865, 0.0038232621736824512, 0.01696798764169216, 0.07609598338603973, 0.4247925579547882, 0.001495402306318283, 0.0007399548776447773, 0.00034370593493804336, 0.0004371475079096854, 
0.00029517506482079625, 0.0003966991207562387, 0.00010218834358965978, 0.0002587549388408661, 0.3014392554759979], [0.09576106071472168, 0.19742576777935028, 0.04419824853539467, 0.15897086262702942, 0.009728380478918552, 0.03078167699277401, 0.047233711928129196, 0.23739659786224365, 0.003990234341472387, 0.0007658710819669068, 0.00021691889560315758, 0.0005430500023066998, 0.0006197905750013888, 0.0003445236652623862, 3.96148934669327e-05, 0.00022218287631403655, 0.17176155745983124], [0.051156047731637955, 0.11404503136873245, 0.03036099672317505, 0.13940556347370148, 0.06586091220378876, 0.04301173985004425, 0.08722876757383347, 0.26729586720466614, 0.002337574027478695, 0.0006473364192061126, 0.0002715618466027081, 0.0005246303626336157, 0.0016815715935081244, 0.000513846636749804, 9.511327516520396e-05, 0.0002063548017758876, 0.19535697996616364], [0.026418259367346764, 0.038911156356334686, 0.016785142943263054, 0.0314890518784523, 0.06365862488746643, 0.10812786966562271, 0.05684245750308037, 0.3726513385772705, 0.002627086825668812, 0.0007097285706549883, 0.0007340862066484988, 0.0004208340251352638, 0.002608712762594223, 0.0046216025948524475, 0.00044560618698596954, 0.0007183123962022364, 0.2722301185131073], [0.008012780919671059, 0.008392502553761005, 0.006005004979670048, 0.008211031556129456, 0.004169235937297344, 0.010460609570145607, 0.014394670724868774, 0.523620069026947, 0.003403652459383011, 0.00355497351847589, 0.00378659600391984, 0.0017423898680135608, 0.004212078172713518, 0.0038116970099508762, 0.0018717258935794234, 0.003007060382515192, 0.391343891620636], [0.190398171544075, 0.004022962413728237, 0.003225849010050297, 0.005385962780565023, 0.003614953951910138, 0.004278045147657394, 0.017346439883112907, 0.3776796758174896, 0.0671217292547226, 0.00980730913579464, 0.0014958289684727788, 0.022284438833594322, 0.008077270351350307, 0.0008733576396480203, 0.00015115168935153633, 0.0003204523236490786, 0.2839163541793823], 
[0.05777157098054886, 0.0007115869084373116, 0.0005771724972873926, 0.0015480992151424289, 0.002678077667951584, 0.002544946502894163, 0.009322685189545155, 0.4118673503398895, 0.1719420701265335, 0.01720162108540535, 0.0033340263180434704, 0.015078948810696602, 0.005922044161707163, 0.0034504143986850977, 0.0005800679209642112, 0.0006591881974600255, 0.2948101758956909], [0.04606853052973747, 0.00315968063659966, 0.0008404816035181284, 0.0011048185406252742, 0.00036002599517814815, 0.0016067589167505503, 0.0026760550681501627, 0.4431912899017334, 0.1331116408109665, 0.00894797034561634, 0.0017203271854668856, 0.01989992894232273, 0.005695056170225143, 0.0028438426088541746, 0.0009144539362750947, 0.0014165209140628576, 0.326442688703537], [0.07055703550577164, 0.0013580756494775414, 0.001080965274013579, 0.0013442065101116896, 0.0015060213627293706, 0.0032457797788083553, 0.01137834507972002, 0.3686525821685791, 0.17252062261104584, 0.03023175708949566, 0.00809342972934246, 0.01638713665306568, 0.02535460889339447, 0.00908887479454279, 0.0011310952249914408, 0.0007668025209568441, 0.277302622795105], [0.2886202037334442, 0.005896302871406078, 0.0023113794159144163, 0.0027034154627472162, 0.00023407158732879907, 0.0025111210998147726, 0.017270607873797417, 0.2812270522117615, 0.11277918517589569, 0.027458515018224716, 0.0064676683396101, 0.04070272296667099, 0.007689335383474827, 0.0016982550732791424, 0.0003148906398564577, 0.0004952427698299289, 0.20162004232406616], [0.054601039737463, 0.002698288531973958, 0.001911144470795989, 0.002415318740531802, 0.001526957144960761, 0.0054902611300349236, 0.006297803949564695, 0.40457311272621155, 0.02545112371444702, 0.008527134545147419, 0.003263151738792658, 0.01973588578402996, 0.11266074329614639, 0.022901998832821846, 0.005088830832391977, 0.006071664858609438, 0.316785603761673], [0.012839782983064651, 0.0012222270015627146, 0.0009808631148189306, 0.0005109180929139256, 0.0008188929641619325, 0.006628413684666157, 
0.0039040548726916313, 0.4408884346485138, 0.0070266821421682835, 0.0049997977912425995, 0.0030268284026533365, 0.011389974504709244, 0.052978985011577606, 0.04722047969698906, 0.022955598309636116, 0.055931977927684784, 0.32667607069015503], [0.00836222991347313, 0.0004100291698705405, 0.0004995992640033364, 0.0005281352787278593, 0.0004244785523042083, 0.0022620155941694975, 0.0026634102687239647, 0.515430748462677, 0.0015850422205403447, 0.0012324906419962645, 0.0019174340413883328, 0.0026088703889399767, 0.039381951093673706, 0.024177078157663345, 0.01811404339969158, 0.006271424703299999, 0.3741310238838196], [0.006829690653830767, 0.006050325930118561, 0.004600776359438896, 0.006124165374785662, 0.00295969657599926, 0.007735991384834051, 0.011275558732450008, 0.5379719734191895, 0.002759274560958147, 0.003173155477270484, 0.0033373888581991196, 0.0014604296302422881, 0.0034570696298033, 0.00301708304323256, 0.001678207772783935, 0.002508054720237851, 0.3950611650943756]]], [[[0.011778379790484905, 0.03165699914097786, 0.007932649925351143, 0.009723009541630745, 0.004191084299236536, 0.01757962256669998, 0.07558417320251465, 0.5286822319030762, 0.008867246098816395, 0.003844513790681958, 0.0023858908098191023, 0.0016925567761063576, 0.02116350457072258, 0.0019341084407642484, 0.0006740562967024744, 0.0024238734040409327, 0.2698860764503479], [0.0070331585593521595, 0.03474276885390282, 0.11704154312610626, 0.012540064752101898, 0.000989006832242012, 0.005053846165537834, 0.032846849411726, 0.46950772404670715, 0.003426749724894762, 0.00802597776055336, 0.0009419370908290148, 0.0006603601505048573, 0.0033269578125327826, 0.0016024248907342553, 0.0005893349880352616, 0.0010437554446980357, 0.30062752962112427], [0.001831729430705309, 0.0043436517007648945, 0.17206604778766632, 0.002442621625959873, 0.0003059872251469642, 0.0009829388000071049, 0.009025774896144867, 0.5763779282569885, 0.0001724223984638229, 0.002573538338765502, 0.00020257063442841172, 
[... raw notebook cell output elided: nested lists of per-head attention-weight probabilities (one 17-value distribution per token, grouped by attention head), as rendered by the attention-visualization cell ...]
0.02074669487774372, 0.1149546280503273, 0.009165883995592594, 0.00021735642803832889, 0.009443937800824642, 0.22521871328353882], [0.011630082502961159, 0.0037700356915593147, 0.00020818108168896288, 0.0004194157081656158, 0.0016446671215817332, 0.013736303895711899, 0.008694665506482124, 0.563090443611145, 0.009616745635867119, 0.0033997532445937395, 0.00030985273770056665, 0.0008021551184356213, 0.0012680522631853819, 0.15150929987430573, 0.0008413230534642935, 0.004538369830697775, 0.22452062368392944], [0.006878736428916454, 0.0006689493311569095, 0.00032080794335342944, 0.00022600509691983461, 0.0004274733364582062, 0.013320536352694035, 0.01279827393591404, 0.5222806334495544, 0.00021103110339026898, 0.0009719950030557811, 0.00010051353456219658, 0.0006432710215449333, 0.0001427538227289915, 0.003683994524180889, 0.18318185210227966, 0.005724438466131687, 0.24841876327991486], [0.027138030156493187, 0.002518105087801814, 0.0008993195369839668, 0.0006400993443094194, 0.0008186582126654685, 0.006884696893393993, 0.02197936736047268, 0.6433086395263672, 0.001396615756675601, 0.0023123135324567556, 0.0009465551120229065, 0.000827718002256006, 0.001291204011067748, 0.004476301837712526, 0.0005229961825534701, 0.012062284164130688, 0.27197712659835815], [0.03181227669119835, 0.013683241792023182, 0.007585412822663784, 0.007766473572701216, 0.007233914919197559, 0.01794290728867054, 0.020610477775335312, 0.525634229183197, 0.006388143170624971, 0.008704784326255322, 0.0070458874106407166, 0.006547593045979738, 0.006779894232749939, 0.0083800433203578, 0.0026259450241923332, 0.009229892864823341, 0.3120289444923401]], [[0.03988378867506981, 0.016114575788378716, 0.0019474404398351908, 0.003629102371633053, 0.0048802695237100124, 0.0040347822941839695, 0.0175474863499403, 0.422654926776886, 0.05543402582406998, 0.1350107192993164, 0.004487651400268078, 0.017643090337514877, 0.06273152679204941, 0.004586480092257261, 0.0026807342655956745, 0.006280739791691303, 
0.2004527747631073], [0.008343123830854893, 0.06480352580547333, 0.06734392046928406, 0.15420940518379211, 0.020311636850237846, 0.007981322705745697, 0.03950008377432823, 0.44986358284950256, 0.0035799748729914427, 0.006267112214118242, 0.00032282964093610644, 0.0013669135514646769, 0.007876849733293056, 0.000252675439696759, 4.106622509425506e-05, 0.0006186027894727886, 0.16731739044189453], [0.0007404094212688506, 0.00471192691475153, 0.028277264907956123, 0.02903577871620655, 0.0038026783149689436, 0.002236943459138274, 0.0075379288755357265, 0.692842960357666, 7.785890920786187e-05, 0.0017359106568619609, 0.0001720025611575693, 0.00022148275456856936, 0.00040373875526711345, 2.4747483621467836e-05, 0.0003487702342681587, 0.0007463162182830274, 0.2270832657814026], [0.0010223849676549435, 0.008215422742068768, 0.03839661180973053, 0.03378839045763016, 0.005131866317242384, 0.004833152983337641, 0.01119320560246706, 0.6800437569618225, 0.0002889969327952713, 0.0015977438306435943, 0.00013645924627780914, 0.0002946999447885901, 0.0019201133400201797, 9.586686792317778e-05, 7.265232125064358e-05, 0.0003744226705748588, 0.21259425580501556], [0.006782756187021732, 0.01822287030518055, 0.06442795693874359, 0.2947053015232086, 0.033405888825654984, 0.027554195374250412, 0.2540339231491089, 0.15079092979431152, 0.002170210238546133, 0.023842360824346542, 0.00027821955154649913, 0.0013749625068157911, 0.006694023963063955, 0.0005033547058701515, 0.0004729980428237468, 0.0014454862102866173, 0.1132945716381073], [0.006512910593301058, 0.022717759013175964, 0.14369069039821625, 0.09887434542179108, 0.023103442043066025, 0.010689659975469112, 0.146686851978302, 0.35266202688217163, 0.0024755720514804125, 0.004967536311596632, 0.00045513565419241786, 0.0012000019196420908, 0.015088378451764584, 0.0009681986994110048, 0.00019964328384958208, 0.0014922262635082006, 0.1682155579328537], [0.0059058330953121185, 0.019681943580508232, 0.04358895495533943, 0.060179464519023895, 
0.013434232212603092, 0.010341964662075043, 0.05094344913959503, 0.5666913390159607, 0.0018343155970796943, 0.0027329346630722284, 0.00017579644918441772, 0.0005925801815465093, 0.0028153720777481794, 0.001525204279460013, 0.00048009847523644567, 0.0020878068171441555, 0.2169886976480484], [0.006440408062189817, 0.01626414805650711, 0.01415314432233572, 0.01805269531905651, 0.0033530760556459427, 0.007003031205385923, 0.016636159271001816, 0.6600470542907715, 0.00398850254714489, 0.008421809412539005, 0.001386127551086247, 0.003300258656963706, 0.010614327155053616, 0.0020348820835351944, 0.0008414218900725245, 0.00505413394421339, 0.22240883111953735], [0.012853391468524933, 0.004776861052960157, 0.0032535125501453876, 0.003657883033156395, 0.002092045033350587, 0.0011796079343184829, 0.019057398661971092, 0.377326101064682, 0.03503330424427986, 0.15644468367099762, 0.004486836027354002, 0.01563706248998642, 0.1515296846628189, 0.00624211085960269, 0.0011050363536924124, 0.0059441314078867435, 0.19938041269779205], [0.0030522027518600225, 0.0014701259788125753, 0.0025209663435816765, 0.004543520510196686, 0.0018592218402773142, 0.0005791988805867732, 0.0039147580973804, 0.34437522292137146, 0.03269872069358826, 0.2222934067249298, 0.007597112096846104, 0.03466250002384186, 0.10350389033555984, 0.005456297658383846, 0.0015545738860964775, 0.005576569586992264, 0.22434177994728088], [0.0018067974597215652, 0.0011239623418077826, 0.002497342647984624, 0.001246894709765911, 0.0005790196591988206, 0.0005001184181310236, 0.0032356134615838528, 0.5607758164405823, 0.01831120438873768, 0.015686875209212303, 0.013449402526021004, 0.018523871898651123, 0.05473627150058746, 0.024943485856056213, 0.005261460784822702, 0.015012739226222038, 0.26230910420417786], [0.004094633273780346, 0.001175370649434626, 0.001451934571377933, 0.001484404900111258, 0.0005301795899868011, 0.0004408232925925404, 0.007176279090344906, 0.43427199125289917, 0.04389841482043266, 
0.05618460848927498, 0.008489617146551609, 0.013249777257442474, 0.11628926545381546, 0.017227115109562874, 0.00130241340957582, 0.005977923516184092, 0.2867552638053894], [0.004619250539690256, 0.001441951491869986, 0.0010173094924539328, 0.0006104934145696461, 0.0003160855558235198, 0.0015940176090225577, 0.013335556723177433, 0.5105360746383667, 0.016380852088332176, 0.053148671984672546, 0.003988656215369701, 0.00916975922882557, 0.04018981382250786, 0.012267998419702053, 0.0022186245769262314, 0.005697234068065882, 0.3234676122665405], [0.003948579076677561, 0.0011868085712194443, 0.0015574127901345491, 0.000564271118491888, 0.0003261391830164939, 0.001464462373405695, 0.013518979772925377, 0.5092573165893555, 0.00967163685709238, 0.02442529983818531, 0.0047238050028681755, 0.007272594142705202, 0.07750549167394638, 0.06930477172136307, 0.007030998356640339, 0.020727451890707016, 0.24751393496990204], [0.002848736010491848, 0.0007423038478009403, 0.0048252884298563, 0.0009277939680032432, 0.00026626561884768307, 0.0007104513933882117, 0.0038932666648179293, 0.35254186391830444, 0.011644139885902405, 0.024805789813399315, 0.008339283987879753, 0.008197507821023464, 0.10998876392841339, 0.10456009954214096, 0.02098405361175537, 0.10858126729726791, 0.23614317178726196], [0.0019770367071032524, 0.0005874845664948225, 0.005322036799043417, 0.001890881103463471, 0.0007340035517700016, 0.0004957995843142271, 0.02190230041742325, 0.25823909044265747, 0.03236871585249901, 0.05970766022801399, 0.007474488578736782, 0.009765254333615303, 0.16584376990795135, 0.14644932746887207, 0.02097996324300766, 0.07949724048376083, 0.1867649257183075], [0.002232104307040572, 0.005492780823260546, 0.0063850162550807, 0.005171177908778191, 0.0010029423283413053, 0.0024654208682477474, 0.005871399771422148, 0.7407540082931519, 0.0016971953446045518, 0.003997058607637882, 0.0009317698422819376, 0.0014509912580251694, 0.004707454703748226, 0.001061597722582519, 0.0006480918964371085, 
0.0029277054127305746, 0.21320320665836334]], [[0.13112637400627136, 0.098299041390419, 0.03797876834869385, 0.08319570124149323, 0.01173576433211565, 0.06636365503072739, 0.3366886079311371, 0.09494784474372864, 0.015217344276607037, 0.002942705526947975, 0.0007382103358395398, 0.0037700894754379988, 0.03500792384147644, 0.003419391345232725, 0.0005956703098490834, 0.0026796257589012384, 0.07529328763484955], [0.08929329365491867, 0.17417871952056885, 0.07677249610424042, 0.05096729099750519, 0.014344840310513973, 0.03537403792142868, 0.04329613223671913, 0.19510518014431, 0.054338205605745316, 0.01832316629588604, 0.004299004562199116, 0.005284843500703573, 0.011951224878430367, 0.008844699710607529, 0.01188187301158905, 0.01629089191555977, 0.18945419788360596], [0.018616752699017525, 0.026499483734369278, 0.1842060536146164, 0.05412338301539421, 0.04867803305387497, 0.022157808765769005, 0.049234408885240555, 0.2578485310077667, 0.01987922005355358, 0.026008866727352142, 0.012836821377277374, 0.007937817834317684, 0.0034649462904781103, 0.013394936919212341, 0.026508202776312828, 0.031100325286388397, 0.1975044161081314], [0.010292326100170612, 0.009523626416921616, 0.034536782652139664, 0.2279977798461914, 0.07522223144769669, 0.016023078933358192, 0.055351823568344116, 0.27501586079597473, 0.007916836999356747, 0.009241070598363876, 0.0021122493781149387, 0.002398906974121928, 0.0032665543258190155, 0.004605850670486689, 0.01365593820810318, 0.021331992000341415, 0.23150701820850372], [0.0076224831864237785, 0.024256709963083267, 0.13846223056316376, 0.39482539892196655, 0.1134888157248497, 0.028290383517742157, 0.0966339111328125, 0.07660440355539322, 0.011470011435449123, 0.004032960627228022, 0.0009242743835784495, 0.0012889470672234893, 0.002588564995676279, 0.0034280067775398493, 0.003765709465369582, 0.01141941174864769, 0.08089785277843475], [0.060092926025390625, 0.07290763407945633, 0.0268718209117651, 0.07306665182113647, 0.038684502243995667, 
0.08964470028877258, 0.1679389327764511, 0.12508589029312134, 0.060754843056201935, 0.02643582783639431, 0.006259375251829624, 0.010927294380962849, 0.026710020378232002, 0.017997274175286293, 0.009899232536554337, 0.03717024251818657, 0.14955288171768188], [0.046239420771598816, 0.029400838539004326, 0.037940654903650284, 0.027903564274311066, 0.013527109287679195, 0.012348546646535397, 0.07214798778295517, 0.48802459239959717, 0.002776437671855092, 0.0025310534983873367, 0.0015120228054001927, 0.0016497262986376882, 0.010194830596446991, 0.0019752690568566322, 0.00380500964820385, 0.013101187534630299, 0.23492184281349182], [0.06643960624933243, 0.07960347831249237, 0.04663100838661194, 0.06740008294582367, 0.0429096557199955, 0.06993631273508072, 0.06676341593265533, 0.13688868284225464, 0.036490168422460556, 0.028345592319965363, 0.029728369787335396, 0.03036848083138466, 0.03555829077959061, 0.02969147264957428, 0.031145256012678146, 0.043867386877536774, 0.1582326889038086], [0.029347438365221024, 0.054731737822294235, 0.02423117868602276, 0.018885619938373566, 0.0028740796260535717, 0.022202041000127792, 0.08798391371965408, 0.04044013470411301, 0.3346095085144043, 0.0208131093531847, 0.011984352953732014, 0.08050338178873062, 0.14684908092021942, 0.06363319605588913, 0.0010034691076725721, 0.006376439705491066, 0.05353127792477608], [0.028015557676553726, 0.07446622848510742, 0.0808921679854393, 0.046195488423109055, 0.02071835659444332, 0.05952954664826393, 0.08718353509902954, 0.09213827550411224, 0.12403269857168198, 0.0449085496366024, 0.010576332919299603, 0.04887360334396362, 0.12313409149646759, 0.0640578344464302, 0.002776630688458681, 0.014259631745517254, 0.0782414972782135], [0.02594302035868168, 0.010174636729061604, 0.01232949085533619, 0.008751103654503822, 0.002157071838155389, 0.009560581296682358, 0.025953933596611023, 0.3126137852668762, 0.06809480488300323, 0.014314180240035057, 0.039235055446624756, 0.04431382194161415, 
0.06792160123586655, 0.08493804186582565, 0.0036732121370732784, 0.01663738489151001, 0.25338831543922424], [0.01993793621659279, 0.020772119984030724, 0.01225958950817585, 0.017012128606438637, 0.004575349856168032, 0.02879076451063156, 0.0663299635052681, 0.1862061321735382, 0.19002898037433624, 0.01549505814909935, 0.015554988756775856, 0.07573859393596649, 0.061943169683218, 0.09717091917991638, 0.0015127590158954263, 0.012691052630543709, 0.17398056387901306], [0.05456272140145302, 0.035197675228118896, 0.020155729725956917, 0.0217574592679739, 0.004355205222964287, 0.017263438552618027, 0.060405176132917404, 0.25601309537887573, 0.10454913228750229, 0.005443638190627098, 0.008728220127522945, 0.04110456630587578, 0.09524977207183838, 0.03495359420776367, 0.0024226305540651083, 0.011277208104729652, 0.2265608012676239], [0.01369340717792511, 0.024213401600718498, 0.024550100788474083, 0.02103821188211441, 0.006036289501935244, 0.015708865597844124, 0.049844756722450256, 0.1699150651693344, 0.19129596650600433, 0.017268482595682144, 0.033372703939676285, 0.0654980018734932, 0.0750468298792839, 0.09836339950561523, 0.0037567552644759417, 0.017078053206205368, 0.1733197122812271], [0.007743460591882467, 0.019458720460534096, 0.07256335765123367, 0.017791632562875748, 0.029292121529579163, 0.035826176404953, 0.03024124912917614, 0.3797381520271301, 0.0035287984646856785, 0.013947450555860996, 0.005080304574221373, 0.003365755081176758, 0.0075022196397185326, 0.014577755704522133, 0.02779126539826393, 0.09007030725479126, 0.2414812445640564], [0.014611335471272469, 0.049723733216524124, 0.11213212460279465, 0.06209392473101616, 0.0251474529504776, 0.06838910281658173, 0.08262432366609573, 0.245748832821846, 0.010322009213268757, 0.005408711265772581, 0.0063196769915521145, 0.005497562233358622, 0.018217522650957108, 0.030363747850060463, 0.012123221531510353, 0.0676974430680275, 0.18357928097248077], [0.05940486863255501, 0.05989648029208183, 0.04702525585889816, 
0.061764635145664215, 0.03798598051071167, 0.05553501844406128, 0.09038205444812775, 0.17047938704490662, 0.027133382856845856, 0.025649219751358032, 0.026100708171725273, 0.024749338626861572, 0.032878562808036804, 0.024263527244329453, 0.025066150352358818, 0.04202720522880554, 0.18965817987918854]], [[0.039511971175670624, 0.023227136582136154, 0.020986786112189293, 0.021415524184703827, 0.01165375579148531, 0.04903898388147354, 0.06704988330602646, 0.3532221019268036, 0.009824379347264767, 0.03167214244604111, 0.012398001737892628, 0.0768713727593422, 0.008869635872542858, 0.006480373442173004, 0.005483519285917282, 0.026049794629216194, 0.23624460399150848], [0.017336489632725716, 0.032022152096033096, 0.014891345985233784, 0.03003665618598461, 0.0279243104159832, 0.018684355542063713, 0.0761832669377327, 0.48912903666496277, 0.004628525581210852, 0.005104967392981052, 0.006504908204078674, 0.011768101714551449, 0.00846546795219183, 0.0019485322991386056, 0.0005397393833845854, 0.006927827373147011, 0.24790436029434204], [0.008171319961547852, 0.084152452647686, 0.013201438821852207, 0.012643053196370602, 0.009366299025714397, 0.034474655985832214, 0.007402634713798761, 0.5951064229011536, 0.020463852211833, 0.00323022180236876, 0.0012982020853087306, 0.0016313092783093452, 0.0021555216517299414, 0.011141949333250523, 0.0002599233412183821, 0.0027131270617246628, 0.1925877034664154], [0.010296142660081387, 0.031723108142614365, 0.020072493702173233, 0.02865152806043625, 0.024666277691721916, 0.019877085462212563, 0.01879529282450676, 0.594613790512085, 0.004515630658715963, 0.006347935181111097, 0.0027950911317020655, 0.006079019047319889, 0.0013216607039794326, 0.001422750297933817, 0.00019706672173924744, 0.00798801053315401, 0.22063709795475006], [0.009207528084516525, 0.005360978655517101, 0.0016547112027183175, 0.07299891114234924, 0.01845093071460724, 0.009594419039785862, 0.004134891089051962, 0.6762566566467285, 0.00041089014848694205, 
0.0005293187568895519, 0.00021270563593134284, 0.00038877929910086095, 0.0007947482517920434, 0.0005327356047928333, 0.00011561471910681576, 0.002714182948693633, 0.19664205610752106], [0.016401639208197594, 0.046295344829559326, 0.005714193917810917, 0.014934827573597431, 0.05732293426990509, 0.04511968791484833, 0.03517167642712593, 0.39767220616340637, 0.02251366153359413, 0.0012446582550182939, 0.010193945840001106, 0.005028130952268839, 0.06642040610313416, 0.023687804117798805, 0.00040203702519647777, 0.010458456352353096, 0.24141839146614075], [0.01394050195813179, 0.04503249377012253, 0.034818243235349655, 0.02575930953025818, 0.06134670972824097, 0.12455001473426819, 0.034929059445858, 0.37352633476257324, 0.01239830069243908, 0.003985373303294182, 0.00104584451764822, 0.001423695357516408, 0.005541021004319191, 0.008495138958096504, 0.0003801221610046923, 0.00732740294188261, 0.24550047516822815], [0.027604270726442337, 0.034146737307310104, 0.0149156479164958, 0.017389317974448204, 0.011722655035555363, 0.030974365770816803, 0.024882083758711815, 0.4633851647377014, 0.008861260488629341, 0.005521596409380436, 0.004859026055783033, 0.006376627832651138, 0.013753213919699192, 0.01222193706780672, 0.0027775082271546125, 0.019386490806937218, 0.30122217535972595], [0.06330961734056473, 0.022987423464655876, 0.0022124273236840963, 0.008632184006273746, 0.006022817455232143, 0.008915998041629791, 0.033968206495046616, 0.4593507945537567, 0.023617399856448174, 0.025318635627627373, 0.01784508302807808, 0.06438597291707993, 0.02260000631213188, 0.003553911345079541, 0.00034618188510648906, 0.009915308095514774, 0.2270180583000183], [0.007415013387799263, 0.027772966772317886, 0.0027599381282925606, 0.004768472630530596, 0.0063121565617620945, 0.015607202425599098, 0.0040848334319889545, 0.17809036374092102, 0.44697999954223633, 0.010828511789441109, 0.008027183823287487, 0.006266927346587181, 0.006259840447455645, 0.17732086777687073, 0.0007812769035808742, 
0.006514615844935179, 0.09020976722240448], [0.011212128214538097, 0.011064586229622364, 0.00929186586290598, 0.004193339496850967, 0.0035620415583252907, 0.007418776396661997, 0.011346505954861641, 0.4610574245452881, 0.026436612010002136, 0.07780169695615768, 0.019763408228754997, 0.04618450254201889, 0.009108318015933037, 0.016566812992095947, 0.001661119400523603, 0.04846096411347389, 0.23486988246440887], [0.023965485394001007, 0.00413536699488759, 0.0002565238100942224, 0.002042680513113737, 0.0006106494111008942, 0.0022966735996305943, 0.006188013590872288, 0.5016011595726013, 0.009961288422346115, 0.006403713952749968, 0.21530619263648987, 0.00360947591252625, 0.011030085384845734, 0.006289708893746138, 0.00014728176756761968, 0.002058879006654024, 0.20409689843654633], [0.023803701624274254, 0.005861188285052776, 0.0017903328407555819, 0.006711424794048071, 0.0020297346636652946, 0.010083501227200031, 0.02964775636792183, 0.42952361702919006, 0.015205491334199905, 0.0437641441822052, 0.013394557870924473, 0.09778983891010284, 0.008603587746620178, 0.013961265794932842, 0.0009002153528854251, 0.04851935803890228, 0.24841026961803436], [0.023781629279255867, 0.004507013596594334, 0.0009687670972198248, 0.0035743513144552708, 0.00451942253857851, 0.0041238125413656235, 0.010898624546825886, 0.5111874341964722, 0.004257243126630783, 0.004228512290865183, 0.007192783523350954, 0.010231683030724525, 0.09671378135681152, 0.013191650621592999, 0.001124248723499477, 0.020542453974485397, 0.2789566218852997], [0.0025529831182211637, 0.003895567962899804, 0.0009600265184417367, 0.0006933326949365437, 0.0018411152996122837, 0.0051749334670603275, 0.0005668971571139991, 0.6422987580299377, 0.0033138084691017866, 0.0003125722869299352, 0.00016155010962393135, 0.0005037641967646778, 0.007405666168779135, 0.05880843475461006, 0.0009845438180491328, 0.011188008822500706, 0.25933799147605896], [0.007157592568546534, 0.0022572800517082214, 0.001597763504832983, 
0.0014352599391713738, 0.0025404782500118017, 0.004817312583327293, 0.011677275411784649, 0.6589229106903076, 0.001687935902737081, 0.005234307609498501, 0.0021309650037437677, 0.003476101206615567, 0.0032438577618449926, 0.015381939709186554, 0.05429365858435631, 0.012633481062948704, 0.21151188015937805], [0.025185398757457733, 0.02365874871611595, 0.012047477066516876, 0.015559067018330097, 0.008607608266174793, 0.023236973211169243, 0.018744250759482384, 0.5220514535903931, 0.007733246777206659, 0.005960468202829361, 0.0038136481307446957, 0.004987360443919897, 0.0098489448428154, 0.009204031899571419, 0.002490875544026494, 0.012728097848594189, 0.29414233565330505]], [[0.018313711509108543, 0.005940890405327082, 0.005876679439097643, 0.009499255567789078, 0.008861305192112923, 0.008682413958013058, 0.010257997550070286, 0.3123798370361328, 0.05833736062049866, 0.03860795125365257, 0.031123854219913483, 0.06081439554691315, 0.09164400398731232, 0.05548124015331268, 0.036623723804950714, 0.0506565198302269, 0.19689884781837463], [0.014657718129456043, 0.09716122597455978, 0.2672867178916931, 0.07259227335453033, 0.02334175445139408, 0.12757930159568787, 0.10065503418445587, 0.19113008677959442, 0.0003826104511972517, 0.0002926587185356766, 0.0008912059129215777, 0.000835260609164834, 0.0010664568981155753, 0.0005245751235634089, 0.0005235543358139694, 0.0006538259331136942, 0.10042572021484375], [0.008349491283297539, 0.024902788922190666, 0.01833903044462204, 0.01826833374798298, 0.008128257468342781, 0.03520580008625984, 0.0535917654633522, 0.6064145565032959, 0.00034187757410109043, 0.00030555357807315886, 0.0007571222959086299, 0.0003847410553134978, 0.0012817602837458253, 0.0004431113484315574, 0.0007479701307602227, 0.0005889513995498419, 0.22194884717464447], [0.017153488472104073, 0.03340350091457367, 0.053740888833999634, 0.01841166988015175, 0.013879344798624516, 0.06374309211969376, 0.04480122774839401, 0.5024486780166626, 0.00024233687145169824, 
0.00022781525331083685, 0.0006953829433768988, 0.0005272513371892273, 0.0012173546710982919, 0.0006870084907859564, 0.0015585289802402258, 0.0010477708419784904, 0.24621467292308807], [0.03209710493683815, 0.05804307386279106, 0.09475362300872803, 0.07003344595432281, 0.02263137511909008, 0.0779813677072525, 0.06429700553417206, 0.37905317544937134, 0.0005263236234895885, 0.0006622651126235723, 0.0011128010228276253, 0.001376914675347507, 0.0021012439392507076, 0.0008525536395609379, 0.0023185620084404945, 0.0020378162153065205, 0.1901213526725769], [0.02833014540374279, 0.04718078300356865, 0.1283685714006424, 0.04076273739337921, 0.010877200402319431, 0.0696110725402832, 0.1136908158659935, 0.3647468388080597, 0.000303879874991253, 0.00021889631170779467, 0.0013681071577593684, 0.0008177232812158763, 0.0015278429491445422, 0.0006616472383029759, 0.0028530596755445004, 0.0008280561887659132, 0.1878526657819748], [0.003715561470016837, 0.009763323701918125, 0.004705017898231745, 0.006761363707482815, 0.001900262082926929, 0.010291218757629395, 0.0034087798558175564, 0.7826794981956482, 6.97610157658346e-05, 5.0676189857767895e-05, 0.00021489750361070037, 5.193654942559078e-05, 0.0001578514202265069, 9.552129631629214e-05, 0.00032906749402172863, 0.00019312824588268995, 0.17561209201812744], [0.017970794811844826, 0.062402356415987015, 0.16696622967720032, 0.10648570209741592, 0.015602869912981987, 0.05875176191329956, 0.09175565093755722, 0.3043697774410248, 0.0012507723877206445, 0.000907141191419214, 0.0042973109520971775, 0.001423128880560398, 0.0034312354400753975, 0.0013226415030658245, 0.0031563902739435434, 0.001798111479729414, 0.15810813009738922], [0.0052038091234862804, 0.0024959673173725605, 0.008137423545122147, 0.0035186398308724165, 0.0017063079867511988, 0.005162283778190613, 0.007024023216217756, 0.41655731201171875, 0.009800322353839874, 0.0238154549151659, 0.056142132729291916, 0.07141576707363129, 0.07245911657810211, 0.01625915803015232, 
0.06637050956487656, 0.02129381150007248, 0.21263788640499115], [0.0008455067873001099, 0.0003738566010724753, 0.000620844482909888, 0.0002351491857552901, 0.00020837137708440423, 0.0007162520196288824, 0.0007110076257959008, 0.7040684819221497, 0.0015717188362032175, 0.0019563871901482344, 0.010446811094880104, 0.003112013917416334, 0.0041496604681015015, 0.002115382347255945, 0.012127276510000229, 0.003165746806189418, 0.25357550382614136], [0.0014939660904929042, 0.0009045872720889747, 0.0023026205599308014, 0.0007152347243390977, 0.00040488646482117474, 0.0016372604295611382, 0.0027870042249560356, 0.6150442957878113, 0.0015586180379614234, 0.0034028575755655766, 0.019005898386240005, 0.014205393381416798, 0.022172749042510986, 0.004024488851428032, 0.04001874476671219, 0.008228459395468235, 0.26209285855293274], [0.001227494445629418, 0.000421987846493721, 0.0011604413157328963, 0.0008276190492324531, 0.000390278291888535, 0.0010860420297831297, 0.0027716755867004395, 0.5841808319091797, 0.0030065886676311493, 0.005965443793684244, 0.028624000027775764, 0.012488435953855515, 0.0202639102935791, 0.006492956541478634, 0.047732431441545486, 0.012052754871547222, 0.271307110786438], [0.0020037859212607145, 0.0006074415287002921, 0.0015963200712576509, 0.0007590478053316474, 0.0006195286405272782, 0.0015772695187479258, 0.002311371499672532, 0.5919230580329895, 0.0019783054012805223, 0.005207367707043886, 0.00911188405007124, 0.0070528993383049965, 0.012199620716273785, 0.007806513458490372, 0.05115320906043053, 0.027284417301416397, 0.276807963848114], [0.0009521570173092186, 0.0004146900027990341, 0.002537815598770976, 0.0016025726217776537, 0.00038626641617156565, 0.001143086003139615, 0.0031797948759049177, 0.6058680415153503, 0.0007702126749791205, 0.0024959351867437363, 0.010859010741114616, 0.004986986052244902, 0.012542469426989555, 0.002838464453816414, 0.05980003625154495, 0.009359298273921013, 0.28026318550109863], [0.00013300411228556186, 
5.819192301714793e-05, 0.00026542070554569364, 8.597633132012561e-05, 2.5102901417994872e-05, 0.00017787377873901278, 0.00018030447245109826, 0.7573187947273254, 4.654625809052959e-05, 0.0001712797675281763, 0.0005805459804832935, 0.00016586539277341217, 0.00038547965232282877, 0.00017712608678266406, 0.003550883149728179, 0.0009645915124565363, 0.23571306467056274], [0.0006251715822145343, 0.00020973793289158493, 0.0009723530965857208, 0.00034984468948096037, 9.998944733524695e-05, 0.0007569047156721354, 0.0025404649786651134, 0.6856247782707214, 0.00031071557896211743, 0.0008398502832278609, 0.005234739743173122, 0.0011217205319553614, 0.00293088611215353, 0.0013648845488205552, 0.031489331275224686, 0.0037305825389921665, 0.26179805397987366], [0.007324892096221447, 0.014801794663071632, 0.03206262364983559, 0.02388639748096466, 0.006559554021805525, 0.017767498269677162, 0.020332414656877518, 0.6187670826911926, 0.0007042190409265459, 0.0006598670734092593, 0.0021553265396505594, 0.0007668306352570653, 0.0019324008608236909, 0.000753125234041363, 0.0023014014586806297, 0.0015532320830971003, 0.24767139554023743]], [[0.03434949740767479, 0.09732656925916672, 0.02025909721851349, 0.019034333527088165, 0.026469845324754715, 0.02357860468327999, 0.04970265179872513, 0.35647785663604736, 0.009763795882463455, 0.013690073043107986, 0.0070798443630337715, 0.003477126359939575, 0.00954556092619896, 0.006846221629530191, 0.008786574006080627, 0.011321065947413445, 0.30229121446609497], [0.10732719302177429, 0.0077919322066009045, 0.0046666888520121574, 0.01163120474666357, 0.02692550979554653, 0.015101059339940548, 0.07073713093996048, 0.27855873107910156, 0.02543705515563488, 0.016258245334029198, 0.007674371358007193, 0.008647495880723, 0.08392240107059479, 0.02051379717886448, 0.006652924697846174, 0.035290226340293884, 0.2728639543056488], [0.016788607463240623, 0.005209881812334061, 0.0005765125388279557, 0.00066500308457762, 0.0019229698227718472, 
0.00864559318870306, 0.03432951867580414, 0.5353429913520813, 0.011305931955575943, 0.009043002501130104, 0.005086126271635294, 0.007540910970419645, 0.014036419801414013, 0.00844405498355627, 0.00811686273664236, 0.016590815037488937, 0.31635481119155884], [0.02454465441405773, 0.021562485024333, 0.00226368079893291, 0.00042608322110027075, 0.0018893694505095482, 0.01237315684556961, 0.03123902902007103, 0.40341389179229736, 0.060900088399648666, 0.03134068101644516, 0.018840786069631577, 0.023252304643392563, 0.02691102959215641, 0.021040039137005806, 0.007053886540234089, 0.02019565925002098, 0.29275304079055786], [0.01828431710600853, 0.02547898329794407, 0.005784305743873119, 0.001544356346130371, 0.002042484236881137, 0.01551911886781454, 0.05223841220140457, 0.30757850408554077, 0.09024263173341751, 0.0343596525490284, 0.02852541394531727, 0.026669323444366455, 0.053964436054229736, 0.0375848188996315, 0.01355960126966238, 0.03747671842575073, 0.24914702773094177], [0.060479335486888885, 0.01665169931948185, 0.004634980112314224, 0.005256681703031063, 0.0076979827135801315, 0.005134024657309055, 0.039985742419958115, 0.14534661173820496, 0.11328884959220886, 0.018417738378047943, 0.017103714868426323, 0.021427813917398453, 0.21878406405448914, 0.08965367078781128, 0.01007001381367445, 0.06670805811882019, 0.1593589186668396], [0.027636492624878883, 0.009725716896355152, 0.0029750564135611057, 0.0017882157117128372, 0.00404423987492919, 0.008586341515183449, 0.004201411735266447, 0.5246608853340149, 0.027693891897797585, 0.007156014908105135, 0.006775363348424435, 0.01339282002300024, 0.028928864747285843, 0.03153292462229729, 0.013443696312606335, 0.02900322899222374, 0.2584547698497772], [0.0916278287768364, 0.07970874011516571, 0.04223792254924774, 0.07642164081335068, 0.034982819110155106, 0.05841580405831337, 0.09722817689180374, 0.07318754494190216, 0.04362071305513382, 0.03164798393845558, 0.03369677811861038, 0.05337397754192352, 0.05776224657893181, 
0.050733018666505814, 0.0373653918504715, 0.050092630088329315, 0.08789677172899246], [0.043698132038116455, 0.028239743784070015, 0.02698923833668232, 0.03819839656352997, 0.05100812390446663, 0.0450790673494339, 0.15290401875972748, 0.3101671040058136, 0.0005759591003879905, 0.0037953967694193125, 0.0009508946677669883, 0.0002976637042593211, 0.0014606856275349855, 0.0008908226154744625, 0.008029889315366745, 0.005630896892398596, 0.2820839285850525], [0.018396327272057533, 0.018730850890278816, 0.013195384293794632, 0.021400542929768562, 0.016324162483215332, 0.028658747673034668, 0.04996702820062637, 0.4518030881881714, 0.003378210822120309, 0.001182866282761097, 0.004205227363854647, 0.0020564193837344646, 0.004762664437294006, 0.01012951135635376, 0.023584511131048203, 0.017732229083776474, 0.31449222564697266], [0.03573884814977646, 0.011488850228488445, 0.01618473045527935, 0.021006055176258087, 0.028205715119838715, 0.018216989934444427, 0.08476338535547256, 0.4386696219444275, 0.000896258803550154, 0.004407650791108608, 0.0009507917566224933, 0.0006028193165548146, 0.002079796278849244, 0.0014561492716893554, 0.02278987690806389, 0.01099998876452446, 0.3015424609184265], [0.025901664048433304, 0.01776093803346157, 0.027034148573875427, 0.02124025486409664, 0.034851785749197006, 0.043526556342840195, 0.10954408347606659, 0.39232337474823, 0.0005592065863311291, 0.002172097796574235, 0.0012641492066904902, 0.0001356762513751164, 0.001580702723003924, 0.0018871459178626537, 0.023597436025738716, 0.013929898850619793, 0.2826909124851227], [0.025609178468585014, 0.04390484094619751, 0.04311978816986084, 0.018645450472831726, 0.03771309554576874, 0.048595983535051346, 0.0823967456817627, 0.3866143524646759, 0.0005619650473818183, 0.007204623892903328, 0.0012324160197749734, 0.000353985553374514, 0.00040680725942365825, 0.0006121664773672819, 0.0071108099073171616, 0.003588866675272584, 0.29232898354530334], [0.0790477767586708, 0.023823222145438194, 
0.02318662777543068, 0.01139883417636156, 0.008783871307969093, 0.028504153713583946, 0.11555903404951096, 0.3652096390724182, 0.0017764047952368855, 0.00642913905903697, 0.0020696839783340693, 0.0010477869072929025, 0.0022462178021669388, 0.002651205752044916, 0.014996293932199478, 0.009383216500282288, 0.3038869798183441], [0.06269603967666626, 0.034566786140203476, 0.04043963551521301, 0.015412146225571632, 0.008476761169731617, 0.03715137392282486, 0.09280227869749069, 0.3399847149848938, 0.01990005187690258, 0.008868354372680187, 0.008542236872017384, 0.012698394246399403, 0.01711549609899521, 0.029993318021297455, 0.006099820137023926, 0.024025702849030495, 0.24122688174247742], [0.07257869839668274, 0.06747972220182419, 0.07185255736112595, 0.02079324796795845, 0.010351445525884628, 0.04249030351638794, 0.055663954466581345, 0.26376715302467346, 0.027923552319407463, 0.021356778219342232, 0.015784848481416702, 0.0158582404255867, 0.017359206452965736, 0.026533914729952812, 0.030341243371367455, 0.022461794316768646, 0.21740344166755676], [0.06814394891262054, 0.07158570736646652, 0.051978837698698044, 0.06937866657972336, 0.029563002288341522, 0.0553324818611145, 0.07971587777137756, 0.1107180118560791, 0.04086760804057121, 0.030400117859244347, 0.02911742776632309, 0.04730575159192085, 0.05056744068861008, 0.04930172115564346, 0.037636250257492065, 0.04828399047255516, 0.1301032453775406]]], [[[0.22334961593151093, 0.04544507712125778, 0.012901648879051208, 0.11139638721942902, 0.022305287420749664, 0.028880508616566658, 0.14566709101200104, 0.06444352120161057, 0.07382683455944061, 0.008636174723505974, 0.011243883520364761, 0.01433570496737957, 0.08674127608537674, 0.028488367795944214, 0.003859615186229348, 0.011602390557527542, 0.10687657445669174], [0.2054121494293213, 0.1656344085931778, 0.016584224998950958, 0.01615143194794655, 0.007097871974110603, 0.04185329005122185, 0.07906866073608398, 0.0988493263721466, 0.14013904333114624, 
0.02221917174756527, 0.008556939661502838, 0.009960401803255081, 0.048155996948480606, 0.008807132951915264, 0.0017102649435400963, 0.010089295916259289, 0.11971042305231094], [0.11538565158843994, 0.1335320621728897, 0.049986790865659714, 0.03760025277733803, 0.02531510777771473, 0.057201310992240906, 0.03913629427552223, 0.11763870716094971, 0.023321058601140976, 0.017073171213269234, 0.005146743729710579, 0.0031193189788609743, 0.010933952406048775, 0.006641590967774391, 0.0030563760083168745, 0.00934162363409996, 0.34556999802589417], [0.19842711091041565, 0.1941831111907959, 0.017649250105023384, 0.05712096020579338, 0.014766949228942394, 0.047162532806396484, 0.05200577154755592, 0.09889904409646988, 0.044739555567502975, 0.05029186233878136, 0.004741714335978031, 0.005300873424857855, 0.027047136798501015, 0.005149001721292734, 0.0032374528236687183, 0.010221526026725769, 0.16905608773231506], [0.07979316264390945, 0.039140235632658005, 0.023519888520240784, 0.026372916996479034, 0.016375048086047173, 0.04451238736510277, 0.04727036505937576, 0.14241185784339905, 0.023241175338625908, 0.0342213436961174, 0.004796512890607119, 0.004432917572557926, 0.01822643168270588, 0.005359556060284376, 0.006127100437879562, 0.014163713902235031, 0.47003546357154846], [0.09409326314926147, 0.09914492070674896, 0.012539638206362724, 0.025707218796014786, 0.011062676087021828, 0.08826858550310135, 0.13363900780677795, 0.1642266809940338, 0.04140052944421768, 0.015601911582052708, 0.0041093360632658005, 0.004592863842844963, 0.018961625173687935, 0.005631592124700546, 0.0017440576339140534, 0.006951889954507351, 0.2723243236541748], [0.03883233293890953, 0.12268908321857452, 0.03693877533078194, 0.09221889823675156, 0.022110823541879654, 0.07010053843259811, 0.04896273463964462, 0.12092266231775284, 0.02243798039853573, 0.005595621652901173, 0.002802611328661442, 0.004663331899791956, 0.014370420947670937, 0.0054308767430484295, 0.0007929342100396752, 0.004955661948770285, 
0.3861747086048126], [0.029682574793696404, 0.023776760324835777, 0.02015557698905468, 0.018913792446255684, 0.006949894595891237, 0.020058300346136093, 0.03902122378349304, 0.19137701392173767, 0.012320042587816715, 0.009402570314705372, 0.008538808673620224, 0.004676864016801119, 0.008718949742615223, 0.007471541408449411, 0.006121929734945297, 0.01721966825425625, 0.5755945444107056], [0.20950822532176971, 0.02008242905139923, 0.00826579611748457, 0.0074532185681164265, 0.001041739247739315, 0.008228019811213017, 0.06663142889738083, 0.08298065513372421, 0.24637892842292786, 0.005853989627212286, 0.03994271159172058, 0.023511014878749847, 0.0934559628367424, 0.04390285536646843, 0.0031090739648789167, 0.009409905411303043, 0.13024404644966125], [0.07167889922857285, 0.03921680897474289, 0.04868743196129799, 0.014409826137125492, 0.0027814190834760666, 0.019617771729826927, 0.026637515053153038, 0.09544185549020767, 0.15808409452438354, 0.01385003887116909, 0.04845315217971802, 0.056853797286748886, 0.05582408979535103, 0.06893220543861389, 0.005575505085289478, 0.02964528277516365, 0.24431034922599792], [0.06710069626569748, 0.016882674768567085, 0.02311631478369236, 0.018601343035697937, 0.0013110608560964465, 0.012744827196002007, 0.012153306044638157, 0.06607306748628616, 0.2938641607761383, 0.006731635425239801, 0.0906231626868248, 0.0702696219086647, 0.07799869030714035, 0.10082665085792542, 0.004739957395941019, 0.011453518643975258, 0.1255093216896057], [0.09179053455591202, 0.01987321302294731, 0.030382022261619568, 0.009743300266563892, 0.0013207471929490566, 0.012596304528415203, 0.03721640259027481, 0.11156687885522842, 0.17090731859207153, 0.011862560175359249, 0.04265942424535751, 0.033101994544267654, 0.050616104155778885, 0.07870694249868393, 0.004741842858493328, 0.015433902852237225, 0.2774805426597595], [0.2576143741607666, 0.02710997313261032, 0.021341141313314438, 0.019533343613147736, 0.0027714292518794537, 0.015587026253342628, 
0.05105403810739517, 0.08885181695222855, 0.17595644295215607, 0.006353248842060566, 0.020285453647375107, 0.014229584485292435, 0.06799513101577759, 0.036016955971717834, 0.004394363146275282, 0.00834150705486536, 0.1825641691684723], [0.17798307538032532, 0.011355855502188206, 0.00811835564672947, 0.004083146806806326, 0.0007110852748155594, 0.006480460055172443, 0.03835621848702431, 0.07030012458562851, 0.2545212507247925, 0.007946201600134373, 0.04424753040075302, 0.04049238935112953, 0.0911286324262619, 0.10645229369401932, 0.008307570591568947, 0.014278880320489407, 0.11523699760437012], [0.0330929234623909, 0.011904238723218441, 0.02422855980694294, 0.02565036341547966, 0.0029049755539745092, 0.005626334343105555, 0.01817009039223194, 0.17400415241718292, 0.031150320544838905, 0.013966593891382217, 0.04520203545689583, 0.010443846695125103, 0.015949053689837456, 0.021923020482063293, 0.014933396130800247, 0.02724243514239788, 0.5236077308654785], [0.06568560749292374, 0.01716724969446659, 0.028122151270508766, 0.022932587191462517, 0.0036854213103652, 0.013836492784321308, 0.04440990090370178, 0.1338202953338623, 0.04952190816402435, 0.022777052596211433, 0.03206106647849083, 0.01763630285859108, 0.021503973752260208, 0.022958431392908096, 0.015991495922207832, 0.025937054306268692, 0.46195289492607117], [0.016917074099183083, 0.017068440094590187, 0.04986407607793808, 0.02099430188536644, 0.015518118627369404, 0.026000872254371643, 0.03714475780725479, 0.1345081925392151, 0.01887369155883789, 0.032998282462358475, 0.027714664116501808, 0.020497098565101624, 0.014761744998395443, 0.01996416039764881, 0.03010367974638939, 0.0459233857691288, 0.4711473882198334]], [[0.04745788127183914, 0.02330286242067814, 0.00460421247407794, 0.022984430193901062, 0.001726088230498135, 0.008383411914110184, 0.0055729420855641365, 0.11049287021160126, 0.016018880531191826, 0.0343489944934845, 0.006528163328766823, 0.0102576594799757, 0.02138156071305275, 0.004719866905361414, 
0.0015259117353707552, 0.005403659772127867, 0.6752907037734985], [0.0009574544965289533, 0.002554595237597823, 0.027084091678261757, 0.005700032226741314, 0.0003321235708426684, 0.0007152469479478896, 0.003384950803592801, 0.04437915235757828, 0.0007298921700567007, 0.002300325781106949, 0.00017232676327694207, 0.0008278345339931548, 0.002465510740876198, 0.0004068814741913229, 0.0002494769578333944, 0.0016195033676922321, 0.906120777130127], [0.00016531070286873728, 0.0006252555176615715, 0.1420842856168747, 0.0010585576528683305, 0.00020448744180612266, 0.0005079521215520799, 0.001388776465319097, 0.029470516368746758, 9.030548244481906e-05, 0.0008749518892727792, 4.1537332435837016e-05, 0.0003755349025595933, 0.00014518832904286683, 0.00010915933671640232, 0.0002005560090765357, 0.0010218382813036442, 0.8216357231140137], [0.00047921971417963505, 0.001248587155714631, 0.0066655585542321205, 0.06171967834234238, 0.0013536717742681503, 0.0008476137300021946, 0.003490882460027933, 0.029017062857747078, 0.0009102790500037372, 0.0009072656976059079, 0.00031423219479620457, 0.001746626803651452, 0.001966055715456605, 0.0005210721283219755, 0.0001863153011072427, 0.0013494685990735888, 0.8872764706611633], [0.0004168582381680608, 0.0010174739873036742, 0.005488310940563679, 0.0066293273121118546, 0.00565591175109148, 0.0025001955218613148, 0.010205171070992947, 0.034396130591630936, 0.0005716965533792973, 0.0014805422397330403, 0.0005790446302853525, 0.0009272907627746463, 0.0028798049315810204, 0.00030082796001806855, 0.00012959912419319153, 0.0016169508453458548, 0.925204873085022], [0.0006333431228995323, 0.000854389276355505, 0.022259458899497986, 0.008508424274623394, 0.000823633570689708, 0.00474295299500227, 0.01637878455221653, 0.05356863513588905, 0.0004959264770150185, 0.00320037011988461, 0.00023786294332239777, 0.0010376371210440993, 0.0011369785061106086, 0.00040708534652367234, 0.0004274403618182987, 0.0022035797592252493, 0.8830835819244385], 
[0.0003692205063998699, 0.0005525645683519542, 0.014511391520500183, 0.004177744500339031, 0.0008507512975484133, 0.0016379663720726967, 0.017305616289377213, 0.04013440012931824, 0.000274520629318431, 0.0017871917225420475, 0.0004125804698560387, 0.0003394914383534342, 0.0013362254248932004, 0.00046668306458741426, 0.0005942053976468742, 0.0013454346917569637, 0.9139041304588318], [0.0005615242989733815, 0.0008174806134775281, 0.0031197357457131147, 0.0016172801842913032, 0.00022943184012547135, 0.0008133886149153113, 0.0027137103024870157, 0.05553648993372917, 0.0002570735232438892, 0.0006341561675071716, 0.0003574947768356651, 0.0004817527951672673, 0.0006982883205637336, 0.0003166442911606282, 0.00031529372790828347, 0.000871897442266345, 0.9306583404541016], [0.009538985788822174, 0.0036278332117944956, 0.001779107959009707, 0.005225180648267269, 0.0006457989220507443, 0.001817913493141532, 0.01900104247033596, 0.051933590322732925, 0.1896081566810608, 0.027474183589220047, 0.012506905011832714, 0.020788928493857384, 0.07309415191411972, 0.05930856242775917, 0.001197939389385283, 0.010513280518352985, 0.5119383931159973], [0.00324865966103971, 0.0017587152542546391, 0.017829954624176025, 0.0037363844458013773, 0.00023736088769510388, 0.0007947667618282139, 0.024374626576900482, 0.05261950194835663, 0.0075270733796060085, 0.16821813583374023, 0.0031757475808262825, 0.010573476552963257, 0.013765878044068813, 0.00248114881105721, 0.0017197425477206707, 0.01186852715909481, 0.6760703325271606], [0.0013691228814423084, 0.00028685404686257243, 0.0004353309341240674, 0.000653051829431206, 0.00011953420471400023, 0.00024034528178162873, 0.007357578258961439, 0.04939660802483559, 0.002377317985519767, 0.005314792972058058, 0.1814413219690323, 0.004728224594146013, 0.014040631242096424, 0.0022460210602730513, 0.0008619906147941947, 0.005620535928755999, 0.7235108017921448], [0.0035129443276673555, 0.0021358351223170757, 0.003529864829033613, 0.00904749147593975, 
0.00033770076697692275, 0.001375570078380406, 0.013912446796894073, 0.06434755772352219, 0.010392139665782452, 0.013372473418712616, 0.008855434134602547, 0.12270186096429825, 0.013607763685286045, 0.005099679343402386, 0.0012336504878476262, 0.025387519970536232, 0.701150119304657], [0.006513859145343304, 0.0012168746907263994, 0.00026273741968907416, 0.0013329992070794106, 0.0002614783588796854, 0.0005129208439029753, 0.00653555104508996, 0.039823200553655624, 0.02370712161064148, 0.005794275552034378, 0.023297714069485664, 0.008835051208734512, 0.2976222634315491, 0.011080462485551834, 0.000275573693215847, 0.005315711721777916, 0.5676122307777405], [0.0050578187219798565, 0.0015462074661627412, 0.0017522982088848948, 0.0019736173562705517, 0.00048714448348619044, 0.0019928989931941032, 0.02408319152891636, 0.0606706477701664, 0.06658952683210373, 0.011744302697479725, 0.019833983853459358, 0.026256138458848, 0.0824577733874321, 0.167005717754364, 0.0025754880625754595, 0.04184817522764206, 0.4841251075267792], [9.332132322015241e-05, 0.00016968167619779706, 0.002285520313307643, 0.00037579829222522676, 4.4935575715499e-05, 0.00014303285570349544, 0.0015385915758088231, 0.027171317487955093, 0.0001933723542606458, 0.0010026845848187804, 0.000521582318469882, 0.0007205139263533056, 0.0004809320962522179, 0.0006698325742036104, 0.018094154074788094, 0.004332810174673796, 0.9421619772911072], [0.0012205411912873387, 0.0009156119194813073, 0.0058319903910160065, 0.0012008778285235167, 0.00015880161663517356, 0.0009117217850871384, 0.00689153466373682, 0.05865723267197609, 0.0019034824799746275, 0.002919857855886221, 0.0017273338744416833, 0.00487901596352458, 0.006055849604308605, 0.003261469304561615, 0.0011902570258826017, 0.03724973276257515, 0.8650246858596802], [0.0010529852006584406, 0.0012099314481019974, 0.004019669257104397, 0.0023122832644730806, 0.0005112547078169882, 0.0011719244066625834, 0.00534292496740818, 0.05445542931556702, 0.0010839805472642183, 
0.0018686613766476512, 0.0015111233806237578, 0.0015619475161656737, 0.0020400662906467915, 0.001221836544573307, 0.0008549784542992711, 0.0026152473874390125, 0.9171657562255859]], [[0.01675097830593586, 0.029751010239124298, 0.01335798017680645, 0.08050186187028885, 0.00881408154964447, 0.021077139303088188, 0.1438671350479126, 0.27450665831565857, 0.030976010486483574, 0.009374797344207764, 0.016454769298434258, 0.015129131264984608, 0.031961724162101746, 0.008224301040172577, 0.0026940149255096912, 0.003284914419054985, 0.2932735085487366], [0.009807988069951534, 0.04848639667034149, 0.018892770633101463, 0.04522402957081795, 0.006287489086389542, 0.02015862427651882, 0.012982192449271679, 0.20467634499073029, 0.0012946610804647207, 0.0037379288114607334, 0.001596081187017262, 0.0005624869372695684, 0.0007748717907816172, 0.0009768962627276778, 0.002531283302232623, 0.0017632426461204886, 0.6202467083930969], [0.0033734412863850594, 0.036748338490724564, 0.10612611472606659, 0.06509694457054138, 0.01802372746169567, 0.02739577554166317, 0.021190552040934563, 0.12178847938776016, 0.00028220072272233665, 0.002377759199589491, 0.0008520626579411328, 0.00016741218860261142, 0.0002790550352074206, 0.00043050170643255115, 0.006088642403483391, 0.0032250024378299713, 0.5865539908409119], [0.016044748947024345, 0.01759292371571064, 0.012684546411037445, 0.04510921984910965, 0.013592444360256195, 0.01100713200867176, 0.006792750675231218, 0.15109962224960327, 0.0019960757344961166, 0.005790626630187035, 0.004013760946691036, 0.001138357911258936, 0.0024620031472295523, 0.0011090605985373259, 0.005360256880521774, 0.0026596926618367434, 0.7015467882156372], [0.016137422993779182, 0.011018023826181889, 0.008834225125610828, 0.01629466377198696, 0.01579330675303936, 0.02493588626384735, 0.009196176193654537, 0.1381547451019287, 0.0018121449975296855, 0.0028578429482877254, 0.0012880483409389853, 0.0012763841077685356, 0.0015178743051365018, 0.0026097972877323627, 
0.002755651483312249, 0.0030731752049177885, 0.742444634437561], [0.019156169146299362, 0.0235374104231596, 0.01014165859669447, 0.025943588465452194, 0.035142578184604645, 0.09075773507356644, 0.0418345145881176, 0.19485025107860565, 0.0013698843540623784, 0.008079328574240208, 0.001734949299134314, 0.0018870810745283961, 0.00258996500633657, 0.0020618541166186333, 0.00639730878174305, 0.008622993715107441, 0.5258926749229431], [0.08572693169116974, 0.02584565058350563, 0.011317632161080837, 0.021288689225912094, 0.06010239198803902, 0.0797385424375534, 0.09059523791074753, 0.18106722831726074, 0.008589528501033783, 0.015902483835816383, 0.0034983570221811533, 0.0019173118053004146, 0.015633245930075645, 0.004648653324693441, 0.0028057326562702656, 0.008108342066407204, 0.38321396708488464], [0.017198767513036728, 0.020206568762660027, 0.009553317911922932, 0.01158920768648386, 0.005120620597153902, 0.022092066705226898, 0.020027711987495422, 0.22345352172851562, 0.001523945014923811, 0.0035616119857877493, 0.0022848770022392273, 0.0015681169461458921, 0.002939742524176836, 0.0017024994594976306, 0.0031027314253151417, 0.0035284715704619884, 0.6505462527275085], [0.030897509306669235, 0.029706252738833427, 0.005836248863488436, 0.012706184759736061, 0.001009413506835699, 0.008909190073609352, 0.03950071334838867, 0.22726328670978546, 0.01513813529163599, 0.00919159036129713, 0.0046218703500926495, 0.012753555551171303, 0.016143543645739555, 0.0076603456400334835, 0.0010620193788781762, 0.0023256863933056593, 0.5752743482589722], [0.0716332271695137, 0.12027689814567566, 0.018818138167262077, 0.029427461326122284, 0.010100743733346462, 0.06141945719718933, 0.056331392377614975, 0.18822944164276123, 0.01111608650535345, 0.013545231893658638, 0.0010341029847040772, 0.003141344292089343, 0.005898731295019388, 0.004912815988063812, 0.005096282344311476, 0.004002546425908804, 0.3950161039829254], [0.05005652830004692, 0.023346595466136932, 0.013149992562830448, 
0.008236930705606937, 0.0016482234932482243, 0.016047902405261993, 0.021162724122405052, 0.26056280732154846, 0.010271149687469006, 0.01741165854036808, 0.008194400928914547, 0.020672256126999855, 0.016174929216504097, 0.013013890944421291, 0.008251837454736233, 0.014158356003463268, 0.4976397752761841], [0.036740124225616455, 0.056287072598934174, 0.010855757631361485, 0.02545304037630558, 0.0025417148135602474, 0.017475077882409096, 0.030599679797887802, 0.2700594961643219, 0.0076386891305446625, 0.006507040932774544, 0.0016206931322813034, 0.0035116563085466623, 0.0031255113426595926, 0.0031680818647146225, 0.0018867311300709844, 0.0025250008329749107, 0.5200045704841614], [0.04290362074971199, 0.013052191585302353, 0.0032662921585142612, 0.006345386616885662, 0.0014085173606872559, 0.006052630487829447, 0.037504758685827255, 0.26874879002571106, 0.029189229011535645, 0.007478197105228901, 0.005296583753079176, 0.008596551604568958, 0.023249449208378792, 0.016859835013747215, 0.002502261893823743, 0.003873936366289854, 0.5236716866493225], [0.03169481083750725, 0.029814638197422028, 0.011814289726316929, 0.008182005025446415, 0.0032363005448132753, 0.022044910117983818, 0.056778859347105026, 0.26119160652160645, 0.009015236049890518, 0.009280598722398281, 0.004600833635777235, 0.013636325486004353, 0.021582169458270073, 0.023674391210079193, 0.007662277203053236, 0.018768150359392166, 0.467022567987442], [0.039222002029418945, 0.043641261756420135, 0.02286531776189804, 0.004107121843844652, 0.0040258243680000305, 0.045961979776620865, 0.04067198932170868, 0.20811690390110016, 0.0079402020201087, 0.018867485225200653, 0.006081747822463512, 0.00906203594058752, 0.016396893188357353, 0.016430266201496124, 0.054756175726652145, 0.06617768108844757, 0.3956751823425293], [0.02755782939493656, 0.0332137756049633, 0.01498511154204607, 0.007432879880070686, 0.00479091377928853, 0.036900416016578674, 0.03010246902704239, 0.24406196177005768, 0.004864844027906656, 
0.008963802829384804, 0.0019401784520596266, 0.004367450252175331, 0.007558388635516167, 0.012306895107030869, 0.011382388882339, 0.01914852298796177, 0.5304221510887146], [0.013592668808996677, 0.01773017644882202, 0.018561914563179016, 0.029333721846342087, 0.00874125026166439, 0.01348085142672062, 0.01935093104839325, 0.1219952255487442, 0.0028546026442199945, 0.005693513434380293, 0.0035189699847251177, 0.00231643277220428, 0.004245359916239977, 0.002541603520512581, 0.004276231862604618, 0.004385537002235651, 0.7273810505867004]], [[0.009952981024980545, 0.04738154634833336, 0.005876624956727028, 0.00358164613135159, 0.0065017701126635075, 0.033864617347717285, 0.20481941103935242, 0.16318625211715698, 0.01629616506397724, 0.009542688727378845, 0.011546830646693707, 0.008731205016374588, 0.006188470404595137, 0.01297728717327118, 0.016307521611452103, 0.028125157579779625, 0.41511979699134827], [0.09298623353242874, 0.00035160023253411055, 0.0002208138903370127, 0.0005738306790590286, 0.00026605636230669916, 0.001048707403242588, 0.05959992855787277, 0.09746970236301422, 0.006356380879878998, 0.013193927705287933, 0.0026062987744808197, 0.002854249905794859, 0.011599290184676647, 0.010597079992294312, 0.0027982122264802456, 0.023695431649684906, 0.6737822890281677], [0.06408531218767166, 0.0017792348517104983, 0.0005354708409868181, 0.0022419702727347612, 0.0016292397631332278, 0.0034711831249296665, 0.10403171926736832, 0.17723245918750763, 0.006626461166888475, 0.01894967071712017, 0.004174056928604841, 0.005490654148161411, 0.022562619298696518, 0.010317179374396801, 0.008353099226951599, 0.04515991359949112, 0.5233598351478577], [0.033806588500738144, 0.0013781951274722815, 0.0008582998998463154, 0.000281147105852142, 0.0002780203358270228, 0.002088468987494707, 0.015017666853964329, 0.14943912625312805, 0.009429430589079857, 0.006226611789315939, 0.004327022936195135, 0.0029319231398403645, 0.007113713771104813, 0.010861149057745934, 
0.0021692633163183928, 0.008762134239077568, 0.7450311183929443], [0.025979332625865936, 0.004423297941684723, 0.0023171042557805777, 0.001342316623777151, 0.00048344553215429187, 0.005202721804380417, 0.08007914572954178, 0.17749916017055511, 0.013963770121335983, 0.011629631742835045, 0.006705607753247023, 0.00352342426776886, 0.006770276464521885, 0.014785654842853546, 0.0034628526773303747, 0.016041573137044907, 0.6257906556129456], [0.04126323014497757, 0.000779440626502037, 0.0002913215139415115, 0.00040699890814721584, 0.00019081206119153649, 0.0005449484451673925, 0.04097115620970726, 0.131930872797966, 0.00479675829410553, 0.004714667331427336, 0.00319970422424376, 0.001387981348671019, 0.005079562775790691, 0.00883167702704668, 0.00305686192587018, 0.013283140026032925, 0.7392708659172058], [0.02620951645076275, 0.008779098279774189, 0.010552973486483097, 0.0015640028286725283, 0.0023082434199750423, 0.009384912438690662, 0.01815946400165558, 0.13902446627616882, 0.012570589780807495, 0.009399722330272198, 0.01834506168961525, 0.008715079165995121, 0.0056093232706189156, 0.017529653385281563, 0.015946317464113235, 0.01876763068139553, 0.6771338582038879], [0.11347553879022598, 0.025345291942358017, 0.015065250918269157, 0.01908656768500805, 0.011953286826610565, 0.02736031450331211, 0.17395128309726715, 0.11931822448968887, 0.029880160465836525, 0.030435921624302864, 0.028830060735344887, 0.025185909122228622, 0.038336385041475296, 0.043343670666217804, 0.02363160438835621, 0.06330331414937973, 0.21149714291095734], [0.02720571495592594, 0.025808049365878105, 0.002410859102383256, 0.015553703531622887, 0.03539576753973961, 0.12781444191932678, 0.07928664982318878, 0.19401277601718903, 0.0026284398045390844, 0.006241445429623127, 0.003444816218689084, 0.0034225096460431814, 0.006596450228244066, 0.0057450272142887115, 0.003905800636857748, 0.01936313696205616, 0.44116446375846863], [0.014987330883741379, 0.01654433086514473, 0.004670808557420969, 
0.005397018976509571, 0.01839473471045494, 0.0595112070441246, 0.056780021637678146, 0.24405008554458618, 0.005477063823491335, 0.005542357452213764, 0.012659691274166107, 0.004686854314059019, 0.006718002259731293, 0.018565839156508446, 0.008033482357859612, 0.022138889878988266, 0.49584221839904785], [0.023367779329419136, 0.00929228775203228, 0.0057346755638718605, 0.010682579129934311, 0.03002633899450302, 0.03418240323662758, 0.06527934223413467, 0.15962931513786316, 0.0058802710846066475, 0.016777807846665382, 0.005971783772110939, 0.005403390619903803, 0.013459893874824047, 0.018081102520227432, 0.011667517945170403, 0.0308490339666605, 0.5537144541740417], [0.033160556107759476, 0.018078090623021126, 0.003678599139675498, 0.015722043812274933, 0.022328738123178482, 0.035864416509866714, 0.035239096730947495, 0.20345520973205566, 0.008283070288598537, 0.00493867602199316, 0.006215236149728298, 0.0013074162416160107, 0.011122860945761204, 0.016681626439094543, 0.0038104248233139515, 0.013658096082508564, 0.5664558410644531], [0.008265090174973011, 0.07553514093160629, 0.012609598226845264, 0.009644989855587482, 0.020348671823740005, 0.18203967809677124, 0.04354019835591316, 0.1949760913848877, 0.0037799314595758915, 0.003312955843284726, 0.004396146163344383, 0.0024564391933381557, 0.0014623665483668447, 0.005814823321998119, 0.0032012425363063812, 0.010792434215545654, 0.41782429814338684], [0.02328040637075901, 0.031033296138048172, 0.004317097365856171, 0.02397785894572735, 0.03226369246840477, 0.155626580119133, 0.10364849865436554, 0.21960018575191498, 0.0026270789094269276, 0.005100365728139877, 0.0027488917112350464, 0.0032041475642472506, 0.005506304558366537, 0.004633365664631128, 0.002524188719689846, 0.014377628453075886, 0.3655303418636322], [0.054188668727874756, 0.01049194484949112, 0.02951459214091301, 0.00977024994790554, 0.028101466596126556, 0.031393393874168396, 0.12242131680250168, 0.18197093904018402, 0.010021853260695934, 
0.02751804329454899, 0.018421322107315063, 0.0161918755620718, 0.016567107290029526, 0.025737939402461052, 0.007757308427244425, 0.06321997195482254, 0.34671199321746826], [0.03405828773975372, 0.04465678706765175, 0.044988345354795456, 0.0158408060669899, 0.036020077764987946, 0.07103562355041504, 0.11676257103681564, 0.22488602995872498, 0.011830955743789673, 0.009717056527733803, 0.009797455742955208, 0.0113305589184165, 0.015888774767518044, 0.018153980374336243, 0.0065796454437077045, 0.011301378719508648, 0.3171516954898834], [0.07268758118152618, 0.06594744324684143, 0.049214161932468414, 0.06161388009786606, 0.06561169028282166, 0.07568227499723434, 0.16902905702590942, 0.05642540752887726, 0.040534183382987976, 0.04548986628651619, 0.035365715622901917, 0.03673427551984787, 0.051265012472867966, 0.04364994168281555, 0.030975796282291412, 0.05540724843740463, 0.044366504997015]], [[0.005204841028898954, 0.03407548367977142, 0.002107110572978854, 0.1225392073392868, 0.015153540298342705, 0.050548799335956573, 0.6777923107147217, 0.04215997830033302, 0.0019765684846788645, 0.0022024039644747972, 0.0008170875371433794, 0.0005653582629747689, 0.0031630704179406166, 0.00042511516949161887, 0.00014279474271461368, 0.0006465389742515981, 0.04047980159521103], [0.14373140037059784, 0.02910265326499939, 0.008474052883684635, 0.00771362753584981, 0.006538440473377705, 0.017508842051029205, 0.006405793130397797, 0.17720316350460052, 0.011413845233619213, 0.02183070033788681, 0.0028578867204487324, 0.0057427240535616875, 0.020800096914172173, 0.0018734498880803585, 0.0013526835246011615, 0.008654248900711536, 0.528796374797821], [0.02394009567797184, 0.017240235581994057, 0.012724577449262142, 0.01197814755141735, 0.004304948262870312, 0.018189184367656708, 0.002811320824548602, 0.106321319937706, 0.0020229408983141184, 0.0029710757080465555, 0.0015049027279019356, 0.0019795820116996765, 0.003968261182308197, 0.0018299929797649384, 0.0014813030138611794, 
[Output truncated: raw attention-weight arrays (nested lists of per-token attention probabilities) omitted.]
0.001527403830550611, 0.008739849552512169, 0.006277165841311216, 0.00018359145906288177, 0.0011387733975425363, 0.24871094524860382], [0.010000265203416348, 0.07943227142095566, 0.021069636568427086, 0.009315059520304203, 0.011345758102834225, 0.03819071128964424, 0.4252828061580658, 0.05690844729542732, 0.004192695952951908, 0.001430243020877242, 0.013546287082135677, 0.0040448433719575405, 0.0035740595776587725, 0.03202353045344353, 0.001291537773795426, 0.005628380458801985, 0.28272345662117004], [0.007395669352263212, 0.026031125336885452, 0.10057783871889114, 0.021848665550351143, 0.031300514936447144, 0.022250138223171234, 0.31541767716407776, 0.0791143998503685, 0.0015168387908488512, 0.0010223957942798734, 0.0014750591944903135, 0.0006549181998707354, 0.001038587768562138, 0.007426396012306213, 0.047537434846162796, 0.02111954800784588, 0.31427279114723206], [0.012238102033734322, 0.045410145074129105, 0.10139075666666031, 0.044205330312252045, 0.03329649195075035, 0.029889388009905815, 0.2617954909801483, 0.07685995101928711, 0.003647920908406377, 0.0026716962456703186, 0.003937035333365202, 0.002962136175483465, 0.0026276179123669863, 0.01465463824570179, 0.0068154786713421345, 0.025190886110067368, 0.33240699768066406], [0.06677703559398651, 0.07386758178472519, 0.10022735595703125, 0.06604262441396713, 0.05307063087821007, 0.06625064462423325, 0.09755215793848038, 0.08723471313714981, 0.039779406040906906, 0.030507462099194527, 0.03146573156118393, 0.030283087864518166, 0.03821169584989548, 0.029712621122598648, 0.027375781908631325, 0.031202251091599464, 0.13043923676013947]], [[0.043839484453201294, 0.018458504229784012, 0.018967917189002037, 0.033240530639886856, 0.011227412149310112, 0.008591712452471256, 0.005425500217825174, 0.05192077159881592, 0.08482605218887329, 0.12034690380096436, 0.04170461744070053, 0.022460008040070534, 0.1212872713804245, 0.03566421568393707, 0.04893875867128372, 0.12908567488193512, 0.20401468873023987], 
[0.0005790653522126377, 0.04906642809510231, 0.01104047428816557, 0.29612118005752563, 0.021664103493094444, 0.04097248241305351, 0.23143498599529266, 0.09270496666431427, 0.0005533567164093256, 0.0017051202012225986, 0.002947843400761485, 0.0007732336525805295, 0.0019642161205410957, 0.00042760741780512035, 0.0011643823236227036, 0.002625903580337763, 0.2442546784877777], [0.0037722799461334944, 0.028293704614043236, 0.0056546409614384174, 0.14773711562156677, 0.036736391484737396, 0.05317968130111694, 0.12043008208274841, 0.14426356554031372, 0.0007439480978064239, 0.0008441451354883611, 0.0021425888407975435, 0.0012242362136021256, 0.0010087455157190561, 0.000494159001391381, 0.0010413274867460132, 0.0008897356456145644, 0.45154353976249695], [0.0011590784415602684, 0.02242174930870533, 0.008470247499644756, 0.1018047109246254, 0.014652893878519535, 0.03303777053952217, 0.14731980860233307, 0.17970600724220276, 0.0012993603013455868, 0.000856948783621192, 0.005502596497535706, 0.0011445913696661592, 0.0020417727064341307, 0.001182502950541675, 0.001575973117724061, 0.0022075287997722626, 0.4756163954734802], [0.0015097997384145856, 0.01921578124165535, 0.01033713947981596, 0.08234476298093796, 0.006325128488242626, 0.030992664396762848, 0.11893230676651001, 0.15150590240955353, 0.0004954837495461106, 0.0006650839932262897, 0.0017170842038467526, 0.00047383885248564184, 0.0006071251118555665, 0.0005245858337730169, 0.001256351126357913, 0.0021577603183686733, 0.5709391236305237], [0.001060275943018496, 0.0924941748380661, 0.02343287132680416, 0.34283247590065, 0.01669440232217312, 0.0347212553024292, 0.2420918494462967, 0.0902428925037384, 0.0005636181449517608, 0.0007679818663746119, 0.0018245448591187596, 0.0005250336253084242, 0.0011792421573773026, 0.0005724553484469652, 0.001877280999906361, 0.002356108045205474, 0.14676354825496674], [0.0033337578643113375, 0.16263458132743835, 0.05366549268364906, 0.27147290110588074, 0.03128451108932495, 
0.11822152137756348, 0.1743975281715393, 0.08086487650871277, 0.004194461740553379, 0.002514925319701433, 0.004633753560483456, 0.002993798116222024, 0.012483573518693447, 0.0014694520505145192, 0.0009913062676787376, 0.002747619990259409, 0.07209596782922745], [0.009775435552001, 0.02853635512292385, 0.011745673604309559, 0.06012551859021187, 0.0066396137699484825, 0.019984662532806396, 0.05341937020421028, 0.13376447558403015, 0.0012752044713124633, 0.0011308762477710843, 0.0029649478383362293, 0.0014005870325490832, 0.0017976699164137244, 0.0010451172711327672, 0.0013771617086604238, 0.003825996769592166, 0.6611912250518799], [0.20546390116214752, 0.0021567512303590775, 0.001431040815077722, 0.00487711513414979, 0.0010296498658135533, 0.0010882050264626741, 0.0035127755254507065, 0.03880619257688522, 0.01014336571097374, 0.026925912126898766, 0.010471105575561523, 0.009412623941898346, 0.02493830770254135, 0.0058933040127158165, 0.010588611476123333, 0.023249909281730652, 0.6200113296508789], [0.2145441323518753, 0.0006978691671974957, 0.0006750321481376886, 0.001458006096072495, 0.0003609904379118234, 0.0004274809907656163, 0.0010334079852327704, 0.025741081684827805, 0.005036884918808937, 0.006596334278583527, 0.010167498141527176, 0.008488005958497524, 0.006865047849714756, 0.005198287311941385, 0.010797584429383278, 0.01665516570210457, 0.6852571368217468], [0.2058870643377304, 0.0017421097727492452, 0.0013315603137016296, 0.0027966240886598825, 0.0006096321740187705, 0.0009131401893682778, 0.002946594264358282, 0.04171640798449516, 0.00991006288677454, 0.011625617742538452, 0.00837872177362442, 0.008499730378389359, 0.013049280270934105, 0.007724135648459196, 0.019342202693223953, 0.01733972877264023, 0.646187424659729], [0.1754949390888214, 0.00043812947114929557, 0.0004553050093818456, 0.0011572041548788548, 0.00018751212337519974, 0.00023844323004595935, 0.0008088863105513155, 0.023792898282408714, 0.0042212833650410175, 0.006916162557899952, 
0.007370330858975649, 0.003825801657512784, 0.006451516877859831, 0.0024136498104780912, 0.004804171156138182, 0.006885697599500418, 0.7545379400253296], [0.14991098642349243, 0.0008220280287787318, 0.0010433158604428172, 0.0010635626967996359, 0.00029038111097179353, 0.0004619797400664538, 0.0007762779132463038, 0.03153219446539879, 0.005100863985717297, 0.008750739507377148, 0.007443456910550594, 0.005169952288269997, 0.008033890277147293, 0.005532578565180302, 0.01243700459599495, 0.02779497765004635, 0.7338358759880066], [0.24290090799331665, 0.004247015342116356, 0.001626569894142449, 0.005964693147689104, 0.0008909081807360053, 0.0016059810295701027, 0.003402833128347993, 0.04859835281968117, 0.008693018928170204, 0.00931200198829174, 0.007068591192364693, 0.0045773945748806, 0.012528970837593079, 0.006521244999021292, 0.02302958071231842, 0.030857477337121964, 0.5881744623184204], [0.22516126930713654, 0.001240291865542531, 0.0004349542723502964, 0.0018391059711575508, 0.00025114748859778047, 0.0006547844968736172, 0.0010822409531101584, 0.039358437061309814, 0.015868132933974266, 0.004862118046730757, 0.010113048367202282, 0.006636506412178278, 0.013325654901564121, 0.00992174819111824, 0.006901532877236605, 0.013714304193854332, 0.648634672164917], [0.21473778784275055, 0.0015322559047490358, 0.0008709524408914149, 0.0015236164908856153, 0.0002425406564725563, 0.0007383427582681179, 0.0018147750524803996, 0.05486539378762245, 0.008572828024625778, 0.0051580676808953285, 0.008073004893958569, 0.005754687823355198, 0.00822393223643303, 0.008547737263143063, 0.019347606226801872, 0.01674085669219494, 0.6432556509971619], [0.05667409300804138, 0.004187911283224821, 0.004363237880170345, 0.01009769830852747, 0.002748620230704546, 0.004088571760803461, 0.008385852910578251, 0.07417638599872589, 0.005385924596339464, 0.00445749145001173, 0.010954239405691624, 0.009569751098752022, 0.007125008385628462, 0.004697263706475496, 0.0048525030724704266, 
0.010394505225121975, 0.7778409123420715]]], [[[0.119682677090168, 0.03657415509223938, 0.005879373289644718, 0.06013047322630882, 0.011748104356229305, 0.03125850111246109, 0.02326781302690506, 0.21318161487579346, 0.008368825539946556, 0.03946993127465248, 0.008382030762732029, 0.010378782637417316, 0.03116231970489025, 0.004587644245475531, 0.018135081976652145, 0.036566365510225296, 0.3412263095378876], [0.004728312138468027, 0.014894712716341019, 0.010823353193700314, 0.024893706664443016, 0.003325273049995303, 0.005501526407897472, 0.011762293986976147, 0.30839094519615173, 0.0011613927781581879, 0.0036468077450990677, 0.0012750983005389571, 0.001678634900599718, 0.0023293273989111185, 0.0007841555052436888, 0.0007959415088407695, 0.0025151828303933144, 0.6014932990074158], [0.003607547376304865, 0.004824730101972818, 0.00545975286513567, 0.008038468658924103, 0.0023065393324941397, 0.003487263573333621, 0.003361665876582265, 0.3172805607318878, 0.00027798183145932853, 0.0011741180205717683, 0.000451065570814535, 0.00047036277828738093, 0.00047873161383904517, 0.00030445720767602324, 0.000465520191937685, 0.002131069079041481, 0.6458801627159119], [0.004912715870887041, 0.003867042250931263, 0.007869108580052853, 0.011273816227912903, 0.0029014560859650373, 0.0028220918029546738, 0.00621917424723506, 0.30781400203704834, 0.0007236211095005274, 0.0022337862756103277, 0.0005384947289712727, 0.0011393395252525806, 0.001475237775593996, 0.0005238472949713469, 0.0009945313213393092, 0.002126105595380068, 0.6425655484199524], [0.0030781675595790148, 0.0014166809851303697, 0.010156515054404736, 0.0033183882478624582, 0.0009473193786107004, 0.0015475393738597631, 0.0037747046444565058, 0.3000498414039612, 0.0006951651885174215, 0.0039981659501791, 0.0006662574596703053, 0.0009595245937816799, 0.0011959432158619165, 0.000577148690354079, 0.0012109503149986267, 0.0019583944231271744, 0.6644493937492371], [0.0033948083873838186, 0.0072891297750175, 0.010611327365040779, 
0.011299016885459423, 0.0021306446287781, 0.004896234255284071, 0.008769741281867027, 0.3206828832626343, 0.000801234389655292, 0.0028742686845362186, 0.000648608838673681, 0.0014026300050318241, 0.0012696697376668453, 0.0005421547102741897, 0.0009618783951736987, 0.0019377015996724367, 0.6204881072044373], [0.004026367794722319, 0.013461663387715816, 0.010646340437233448, 0.011515709571540356, 0.003191379364579916, 0.009856035932898521, 0.013087722472846508, 0.32650697231292725, 0.0006740486132912338, 0.0013634562492370605, 0.0010407947702333331, 0.0007359760929830372, 0.0008314431761391461, 0.0006198071641847491, 0.00041414666338823736, 0.0035610832273960114, 0.5984670519828796], [0.0015933964168652892, 0.003376591484993696, 0.0025837922003120184, 0.0027546784840524197, 0.001450772164389491, 0.002090931637212634, 0.0017436647322028875, 0.30805233120918274, 0.0002941359707619995, 0.0006639895145781338, 0.0003366142336744815, 0.0005711842095479369, 0.0007556130294688046, 0.00044608174357563257, 0.0007832861738279462, 0.0026589930057525635, 0.669843852519989], [0.0016423105262219906, 0.002691995818167925, 0.0007097254856489599, 0.003509949892759323, 0.0004096374032087624, 0.0005920448456890881, 0.0004623478453140706, 0.2503424286842346, 0.0003968988312408328, 0.0005569732165895402, 0.000437364709796384, 0.0006568634416908026, 0.001920860493555665, 0.000758867128752172, 0.0005842428654432297, 0.001861237222328782, 0.7324662804603577], [0.003160088323056698, 0.0035014653112739325, 0.0034264493733644485, 0.0032941659446805716, 0.0006368197500705719, 0.0021194887813180685, 0.0022297552786767483, 0.2753337323665619, 0.0012888801284134388, 0.0014904489507898688, 0.0012694848701357841, 0.0009460471919737756, 0.00278063234873116, 0.002565214177593589, 0.0020621048752218485, 0.004887385759502649, 0.6890078186988831], [0.0015936325071379542, 0.0004078051424585283, 0.0003029261715710163, 0.0009525819914415479, 0.000191978964721784, 0.00023177142429631203, 
0.00030657180468551815, 0.23669315874576569, 0.00018273202294949442, 0.00042320872307755053, 0.00035787600791081786, 0.0003527290828060359, 0.0014475565403699875, 0.0006441067671403289, 0.0009437997941859066, 0.0022278870455920696, 0.7527396082878113], [0.0015819136751815677, 0.0015424680896103382, 0.0009919197764247656, 0.0028215101920068264, 0.0005708199460059404, 0.0008290550904348493, 0.0010436951415613294, 0.2607632875442505, 0.00033504777820780873, 0.0008360492647625506, 0.000572861754335463, 0.0006374907679855824, 0.0016542492667213082, 0.0008701059850864112, 0.001188513240776956, 0.0023548221215605736, 0.7214062213897705], [0.004719776567071676, 0.0015401177806779742, 0.0010998566867783666, 0.001277686795219779, 0.00029387581162154675, 0.0005708645912818611, 0.0005930970655754209, 0.24179522693157196, 0.0011372871231287718, 0.002820243127644062, 0.002008718904107809, 0.001389667158946395, 0.005891652777791023, 0.003633887507021427, 0.0023290328681468964, 0.007792665157467127, 0.7211062908172607], [0.003471784293651581, 0.0018713419558480382, 0.0005353756132535636, 0.0016390401870012283, 0.0003377064422238618, 0.0007498878403566778, 0.0006242459639906883, 0.2591135501861572, 0.0005434353370219469, 0.0007002720958553255, 0.0005820516380481422, 0.0007835139404051006, 0.0028150943107903004, 0.0017070486210286617, 0.0013770179357379675, 0.0041333651170134544, 0.7190151810646057], [0.0068931421265006065, 0.006108304485678673, 0.0026366703677922487, 0.0018198699690401554, 0.000469216174678877, 0.0016236165538430214, 0.0010705067543312907, 0.3010624945163727, 0.0015292505268007517, 0.0020762707572430372, 0.0008401786326430738, 0.0015810151817277074, 0.00412071542814374, 0.004189734812825918, 0.003698602318763733, 0.006428340915590525, 0.6538521647453308], [0.0063268868252635, 0.0027964734472334385, 0.0027081381995230913, 0.003542772028595209, 0.0008408916764892638, 0.0017972388304769993, 0.0033439956605434418, 0.28153344988822937, 0.001615142566151917, 
0.0028539197519421577, 0.0033585582859814167, 0.0033434887882322073, 0.007771339267492294, 0.006587252486497164, 0.0055069769732654095, 0.013843133114278316, 0.652230441570282], [0.0020433806348592043, 0.004115737974643707, 0.0030028889887034893, 0.0034804437309503555, 0.0019187111174687743, 0.0024421897251158953, 0.0020967663731426, 0.31322988867759705, 0.00046413615928031504, 0.0008238296722993255, 0.0005198113503865898, 0.0007801755564287305, 0.0011932514607906342, 0.0007622826960869133, 0.0010801141615957022, 0.0036541586741805077, 0.6583921909332275]], [[0.08217322826385498, 0.011769932694733143, 0.0049791294150054455, 0.002052875468507409, 0.004368272144347429, 0.008919407613575459, 0.003587773535400629, 0.007302803453058004, 0.0786328911781311, 0.1414363533258438, 0.08520347625017166, 0.08471868932247162, 0.13337840139865875, 0.11672113835811615, 0.07130829244852066, 0.15378789603710175, 0.00965946726500988], [0.020278314128518105, 0.04187458008527756, 0.01494475919753313, 0.02258390560746193, 0.009907709434628487, 0.023853639140725136, 0.020233917981386185, 0.3241523802280426, 0.005544676911085844, 0.015030697919428349, 0.006997787859290838, 0.005184730049222708, 0.008194798603653908, 0.007977431640028954, 0.009925232268869877, 0.015825485810637474, 0.44748997688293457], [0.010544795542955399, 0.029439564794301987, 0.00422721728682518, 0.019836995750665665, 0.014153113588690758, 0.031493835151195526, 0.0258620698004961, 0.3534133732318878, 0.0011475352803245187, 0.003103602211922407, 0.001400218578055501, 0.0012866349425166845, 0.0030750080477446318, 0.001606268109753728, 0.0012649665586650372, 0.004861037712544203, 0.4932836890220642], [0.015261903405189514, 0.048418477177619934, 0.011135386303067207, 0.01936301775276661, 0.013434821739792824, 0.03379392251372337, 0.025605417788028717, 0.3265272378921509, 0.004502957221120596, 0.01239609532058239, 0.005546471104025841, 0.005603297147899866, 0.005943938158452511, 0.0053307292982935905, 0.00275730830617249, 
0.014084560796618462, 0.45029449462890625], [0.004126267973333597, 0.013302577659487724, 0.005113800521939993, 0.020256631076335907, 0.007126645650714636, 0.0193606186658144, 0.018013369292020798, 0.34319260716438293, 0.0013680547708645463, 0.005474824924021959, 0.0022018759045749903, 0.0018413036596029997, 0.003111222293227911, 0.0022224027197808027, 0.0015112675027921796, 0.0066583664156496525, 0.5451182126998901], [0.015158239752054214, 0.03687872365117073, 0.011417957954108715, 0.017705122008919716, 0.011984637938439846, 0.03286717087030411, 0.018257884308695793, 0.3262115716934204, 0.003933407831937075, 0.010555451735854149, 0.006867057178169489, 0.004661425016820431, 0.009707938879728317, 0.007308835629373789, 0.007942474447190762, 0.019533582031726837, 0.45900845527648926], [0.08240387588739395, 0.16304844617843628, 0.054812416434288025, 0.08966787904500961, 0.056542836129665375, 0.14350254833698273, 0.05549096316099167, 0.13249064981937408, 0.012549197301268578, 0.013415309600532055, 0.005827395711094141, 0.0061945198103785515, 0.018521862104535103, 0.008129687048494816, 0.004391726106405258, 0.00952915195375681, 0.1434815227985382], [0.0019068621331825852, 0.004382157698273659, 0.0018128089141100645, 0.00594665901735425, 0.0024996513966470957, 0.005481114611029625, 0.00356927327811718, 0.3514494299888611, 0.0007200753898359835, 0.00167653092648834, 0.0008328179828822613, 0.0009431089856661856, 0.001657589222304523, 0.0013315133983269334, 0.0017236035782843828, 0.00315262982621789, 0.6109141111373901], [0.008972134441137314, 0.011384209617972374, 0.0017613935051485896, 0.009415870532393456, 0.003064833115786314, 0.007505562622100115, 0.006640893407166004, 0.33340805768966675, 0.004573074635118246, 0.007375861518085003, 0.0043639978393912315, 0.003601818112656474, 0.007725631818175316, 0.007405474316328764, 0.005547629203647375, 0.008368537761271, 0.5688850283622742], [0.039139069616794586, 0.02945924922823906, 0.006195316556841135, 0.02591659501194954, 
0.019007060676813126, 0.03293213993310928, 0.012089717201888561, 0.2783716917037964, 0.01563883386552334, 0.01757432520389557, 0.008490326814353466, 0.01594448834657669, 0.039579931646585464, 0.017504308372735977, 0.00606664689257741, 0.01680098846554756, 0.4192892611026764], [0.0020675009582191706, 0.0020794107113033533, 0.0004512038140092045, 0.0013710932107642293, 0.0008696326403878629, 0.0031197366770356894, 0.0028233258053660393, 0.3159424364566803, 0.0008264543721452355, 0.0026929317973554134, 0.0006418319535441697, 0.0012668330455198884, 0.004074615426361561, 0.0018067086348310113, 0.003187961643561721, 0.006306775379925966, 0.6504716277122498], [0.008379057049751282, 0.007660302333533764, 0.0016622405964881182, 0.007345058489590883, 0.004450858570635319, 0.011136407032608986, 0.007255153264850378, 0.3107338547706604, 0.006476256996393204, 0.010816916823387146, 0.006973301526159048, 0.004971171263605356, 0.017492862418293953, 0.007554244715720415, 0.003542584367096424, 0.009426993317902088, 0.5741226077079773], [0.02080857753753662, 0.017689505591988564, 0.0028127802070230246, 0.008033149875700474, 0.0027605409268289804, 0.019351810216903687, 0.014678510837256908, 0.3184175491333008, 0.015050753951072693, 0.016025638207793236, 0.019362716004252434, 0.014855110086500645, 0.013317299075424671, 0.016443397849798203, 0.012139454483985901, 0.021641263738274574, 0.4666120707988739], [0.003064144402742386, 0.002824552357196808, 0.0005567368934862316, 0.0025749315973371267, 0.0012546905782073736, 0.004475915804505348, 0.00318732438609004, 0.33179470896720886, 0.003044947749003768, 0.00405132444575429, 0.003746705362573266, 0.0027252414729446173, 0.008651207201182842, 0.005377228371798992, 0.004926381632685661, 0.00733488192781806, 0.6104092001914978], [0.0020273199770599604, 0.002550124190747738, 0.0004479243070818484, 0.001750982366502285, 0.0014624390751123428, 0.005270801484584808, 0.0035210230853408575, 0.33058226108551025, 0.0022681145928800106, 
0.004073954187333584, 0.0014548917533829808, 0.0038858617190271616, 0.006654147990047932, 0.004717937204986811, 0.002741041826084256, 0.008752353489398956, 0.6178388595581055], [0.016818545758724213, 0.015338807366788387, 0.003248669672757387, 0.00875090528279543, 0.006946306675672531, 0.027110446244478226, 0.01833324506878853, 0.277842253446579, 0.019766584038734436, 0.033802684396505356, 0.014293723739683628, 0.01551347877830267, 0.04968661069869995, 0.026264827698469162, 0.018351318314671516, 0.027776110917329788, 0.42015540599823], [0.0026396960020065308, 0.005944147240370512, 0.002647501416504383, 0.008423368446528912, 0.00321288057602942, 0.007306626997888088, 0.00531102204695344, 0.35773780941963196, 0.0011507720919325948, 0.0022929287515580654, 0.001232140464708209, 0.0014707983937114477, 0.00245307176373899, 0.002024380723014474, 0.0023018536157906055, 0.004039027728140354, 0.5898120403289795]], [[0.04800168424844742, 0.083704873919487, 0.0362408272922039, 0.0612020418047905, 0.01939435862004757, 0.042116910219192505, 0.059538647532463074, 0.027630694210529327, 0.11539368331432343, 0.043917082250118256, 0.028212858363986015, 0.0462634451687336, 0.16703559458255768, 0.11943523585796356, 0.031129876151680946, 0.03965875878930092, 0.031123431399464607], [0.024722959846258163, 0.12860937416553497, 0.07810866087675095, 0.06464578956365585, 0.046371445059776306, 0.05859127268195152, 0.068037249147892, 0.06396441161632538, 0.06323465704917908, 0.060312479734420776, 0.02402551844716072, 0.09665482491254807, 0.027649683877825737, 0.034777283668518066, 0.053983185440301895, 0.03702535480260849, 0.06928592920303345], [0.036931660026311874, 0.023660529404878616, 0.03167734667658806, 0.012308362871408463, 0.01149784866720438, 0.016100024804472923, 0.025463471189141273, 0.25650081038475037, 0.01571366935968399, 0.028854407370090485, 0.023389151319861412, 0.031172964721918106, 0.03470131382346153, 0.014117686077952385, 0.014034667983651161, 0.03899049013853073, 
0.3848855495452881], [0.03456014394760132, 0.04433441907167435, 0.034770604223012924, 0.04458116367459297, 0.0323064811527729, 0.03720846772193909, 0.029751814901828766, 0.23611228168010712, 0.03089768998324871, 0.01945125125348568, 0.018567804247140884, 0.029807372018694878, 0.025672396644949913, 0.016509700566530228, 0.037210896611213684, 0.028814375400543213, 0.2994431257247925], [0.07590455561876297, 0.03289211168885231, 0.021031538024544716, 0.025601878762245178, 0.023594042286276817, 0.05189882591366768, 0.03998985141515732, 0.18357814848423004, 0.02816777490079403, 0.034736260771751404, 0.013865520246326923, 0.07580208033323288, 0.018452633172273636, 0.02570684254169464, 0.051033712923526764, 0.07774655520915985, 0.21999764442443848], [0.04332859441637993, 0.059592265635728836, 0.036509159952402115, 0.04535648971796036, 0.0506095290184021, 0.0830661952495575, 0.05717173218727112, 0.1383264809846878, 0.039717722684144974, 0.04031620919704437, 0.021302545443177223, 0.07544399797916412, 0.020062098279595375, 0.028950637206435204, 0.0510261245071888, 0.049120478332042694, 0.1600998491048813], [0.01835658960044384, 0.01365467719733715, 0.00576998433098197, 0.014165760949254036, 0.008845683187246323, 0.025218207389116287, 0.0257381834089756, 0.26286232471466064, 0.02751028537750244, 0.0195473525673151, 0.029272977262735367, 0.026405496522784233, 0.04130988195538521, 0.026949208229780197, 0.013819715939462185, 0.04643242061138153, 0.3941412568092346], [0.005077761132270098, 0.008481858298182487, 0.006909430492669344, 0.0063754115253686905, 0.00626607658341527, 0.009907298721373081, 0.011660799384117126, 0.3494066894054413, 0.003568548010662198, 0.0036061976570636034, 0.006961391307413578, 0.004364240448921919, 0.008432558737695217, 0.004373473580926657, 0.0063056801445782185, 0.010208433493971825, 0.5480942130088806], [0.04479110985994339, 0.020938469097018242, 0.029449041932821274, 0.011638815514743328, 0.01205146498978138, 0.021687813103199005, 
0.01566934399306774, 0.20999109745025635, 0.04454429820179939, 0.045737188309431076, 0.03080802410840988, 0.030575744807720184, 0.06096185743808746, 0.029980752617120743, 0.04901446774601936, 0.027733733877539635, 0.31442686915397644], [0.020245997235178947, 0.011888718232512474, 0.01683388650417328, 0.008572207763791084, 0.007634407840669155, 0.015447032637894154, 0.017190929502248764, 0.19868670403957367, 0.01987418159842491, 0.05860292539000511, 0.04583500698208809, 0.06363064050674438, 0.06451527774333954, 0.028580324724316597, 0.02860206924378872, 0.06422074884176254, 0.3296389877796173], [0.013758668676018715, 0.009595423936843872, 0.006749797612428665, 0.004344916436821222, 0.003931278362870216, 0.010937336832284927, 0.007002472877502441, 0.2755400538444519, 0.011350201442837715, 0.02099747397005558, 0.04609410837292671, 0.019317712634801865, 0.04497183859348297, 0.01591489277780056, 0.01662302203476429, 0.027341432869434357, 0.46552935242652893], [0.00878020841628313, 0.014445461332798004, 0.011212329380214214, 0.013908833265304565, 0.0068495012819767, 0.017579704523086548, 0.02152176946401596, 0.18696393072605133, 0.03176922723650932, 0.04556567221879959, 0.06310161203145981, 0.08729543536901474, 0.03383221477270126, 0.05239216983318329, 0.05396859720349312, 0.0716511458158493, 0.27916210889816284], [0.011757960543036461, 0.03441855311393738, 0.018426017835736275, 0.013140322640538216, 0.011919529177248478, 0.0438603013753891, 0.02173873968422413, 0.25471290946006775, 0.026234019547700882, 0.0267641544342041, 0.013883348554372787, 0.03176715224981308, 0.02942175790667534, 0.03432067856192589, 0.05594983324408531, 0.03221704438328743, 0.3394675552845001], [0.018792476505041122, 0.03650365769863129, 0.03408867493271828, 0.015645088627934456, 0.01188909262418747, 0.03398241847753525, 0.03505265340209007, 0.25461286306381226, 0.016482515260577202, 0.02781561017036438, 0.016233807429671288, 0.027501383796334267, 0.02342269755899906, 0.023273857310414314, 
[Cell output truncated: nested attention-weight arrays, one 17×17 probability matrix per attention head (each row sums to ~1), rendered by the attention-head visualization.]
0.32451847195625305, 0.005878430791199207, 0.00430064694955945, 0.008754890412092209, 0.009346920065581799, 0.009809249080717564, 0.00710773142054677, 0.0052851540967822075, 0.01579752005636692, 0.5419517755508423], [0.06530153751373291, 0.05581533536314964, 0.004615980200469494, 0.008040301501750946, 0.0324745774269104, 0.055171724408864975, 0.022884028032422066, 0.2389223426580429, 0.02392970398068428, 0.02138812467455864, 0.02622242271900177, 0.026544980704784393, 0.026619836688041687, 0.015003722161054611, 0.030615337193012238, 0.04791983589529991, 0.2985301911830902], [0.0186065211892128, 0.03110244683921337, 0.007531535346060991, 0.006384595762938261, 0.01573231630027294, 0.02773648127913475, 0.014105214737355709, 0.3217688202857971, 0.006777422036975622, 0.019051678478717804, 0.011984478682279587, 0.016232356429100037, 0.007189901079982519, 0.006299290806055069, 0.015625718981027603, 0.02901366539299488, 0.44485756754875183], [0.06479699909687042, 0.0203217975795269, 0.0049687158316373825, 0.006786097772419453, 0.015438761562108994, 0.020039653405547142, 0.014200984500348568, 0.261074960231781, 0.014420581050217152, 0.023350222036242485, 0.056368596851825714, 0.0322117917239666, 0.042492445558309555, 0.01031960267573595, 0.03449470177292824, 0.031758785247802734, 0.3469553291797638], [0.07967878133058548, 0.03959923982620239, 0.00844874419271946, 0.0067182788625359535, 0.022977136075496674, 0.03395850583910942, 0.016678674146533012, 0.2410414218902588, 0.019603190943598747, 0.019815417006611824, 0.027524985373020172, 0.03829849511384964, 0.03646773844957352, 0.01738143153488636, 0.02577395550906658, 0.054980892688035965, 0.3110530972480774], [0.06913606822490692, 0.04853967949748039, 0.00267248647287488, 0.006781389936804771, 0.011949611827731133, 0.027468914166092873, 0.008769519627094269, 0.22675353288650513, 0.024301622062921524, 0.02624249830842018, 0.049577295780181885, 0.03207225352525711, 0.08995068818330765, 0.020404692739248276, 0.03635886684060097, 
0.04128899425268173, 0.2777319550514221], [0.09848746657371521, 0.04065364971756935, 0.003563826670870185, 0.0045530120842158794, 0.021409759297966957, 0.03902905434370041, 0.014178195036947727, 0.21531948447227478, 0.025104945525527, 0.01594237983226776, 0.033953890204429626, 0.04222805052995682, 0.04058384150266647, 0.03738243877887726, 0.04146037623286247, 0.06771257519721985, 0.25843706727027893], [0.02186203934252262, 0.009089235216379166, 0.008449574932456017, 0.0069407629780471325, 0.010747908614575863, 0.011311180889606476, 0.01207310613244772, 0.2488556206226349, 0.02082022838294506, 0.02715897187590599, 0.04647752642631531, 0.06585762649774551, 0.03935054689645767, 0.02987242117524147, 0.03858870267868042, 0.04665840044617653, 0.35588622093200684], [0.06198921054601669, 0.03536825254559517, 0.007430571597069502, 0.012308758683502674, 0.021207600831985474, 0.03148888424038887, 0.022302895784378052, 0.2061266154050827, 0.036144208163022995, 0.02327052690088749, 0.03840610012412071, 0.045426931232213974, 0.06916413456201553, 0.04437774047255516, 0.032215967774391174, 0.05239536613225937, 0.2603762149810791], [0.016722913831472397, 0.004445492755621672, 0.009419319219887257, 0.01058664359152317, 0.0068637048825621605, 0.006029916927218437, 0.010071957483887672, 0.32445138692855835, 0.007086324971169233, 0.005023763980716467, 0.010848034173250198, 0.010410626418888569, 0.011794120073318481, 0.008979752659797668, 0.0061179823242127895, 0.016704149544239044, 0.5344439744949341]], [[0.008838932029902935, 0.05999119207262993, 0.02032579854130745, 0.029770970344543457, 0.0046866172924637794, 0.012430781498551369, 0.08697380870580673, 0.1206798180937767, 0.047204192727804184, 0.009483088739216328, 0.06672019511461258, 0.008920884691178799, 0.10948628187179565, 0.19007766246795654, 0.016055172309279442, 0.043300095945596695, 0.16505447030067444], [0.006277316249907017, 0.030027559027075768, 0.016165612265467644, 0.02920420467853546, 0.0074911778792738914, 
0.010401354171335697, 0.005729466676712036, 0.3119165003299713, 0.005573493894189596, 0.0040230778977274895, 0.0021689997520297766, 0.002085156738758087, 0.0056680114939808846, 0.0027740884106606245, 0.0014002544339746237, 0.00316954986192286, 0.555924117565155], [0.0028562152292579412, 0.010485831648111343, 0.00589174497872591, 0.007633655797690153, 0.004016880411654711, 0.00910889357328415, 0.0030102611053735018, 0.32462581992149353, 0.0011670918902382255, 0.0014104503206908703, 0.0007403984200209379, 0.0007315311231650412, 0.0019150173757225275, 0.0007510724826715887, 0.0008684338536113501, 0.0017024397384375334, 0.6230842471122742], [0.013247065246105194, 0.06848253309726715, 0.02582527883350849, 0.059877052903175354, 0.01903846673667431, 0.03915196284651756, 0.013449899852275848, 0.2992263436317444, 0.005860950797796249, 0.005228393245488405, 0.0021665478125214577, 0.0017033166950568557, 0.003786279819905758, 0.0031911644618958235, 0.0013446449302136898, 0.0035184372682124376, 0.4349015951156616], [0.001979042077437043, 0.011797063983976841, 0.00874242465943098, 0.011050427332520485, 0.0038752045948058367, 0.009132196195423603, 0.004224082455039024, 0.320546954870224, 0.0004811668477486819, 0.0011068056337535381, 0.00024345020938199013, 0.00024754495825618505, 0.0005716445157304406, 0.00018906612240243703, 0.0003468822978902608, 0.0007660789997316897, 0.6247000098228455], [0.005987359676510096, 0.02407584711909294, 0.01559499092400074, 0.016291232779622078, 0.008457396179437637, 0.019713999703526497, 0.006674196105450392, 0.32652342319488525, 0.002775958739221096, 0.003231125883758068, 0.0012888950295746326, 0.0009976557921618223, 0.0021340963430702686, 0.001570488209836185, 0.0014272535918280482, 0.0032574962824583054, 0.559998631477356], [0.0023013888858258724, 0.007948913611471653, 0.007530550938099623, 0.006467541679739952, 0.00343892490491271, 0.00926176831126213, 0.002284273272380233, 0.3121257424354553, 0.0011618515709415078, 0.0009515214478597045, 
0.001110097044147551, 0.000712997920345515, 0.0013345349580049515, 0.0009808051399886608, 0.0006136141601018608, 0.002311359392479062, 0.6394640803337097], [0.0018660355126485229, 0.007614978589117527, 0.0026823594234883785, 0.007038292475044727, 0.00157541676890105, 0.0035376683808863163, 0.0011946283048018813, 0.3208935558795929, 0.0009267051354981959, 0.0009504372719675303, 0.0007879172917455435, 0.0006295080529525876, 0.001908165984787047, 0.0009390199556946754, 0.0010099190985783935, 0.0015289455186575651, 0.6449163556098938], [0.0025385264307260513, 0.015972135588526726, 0.003545205807313323, 0.011749769560992718, 0.0012723729014396667, 0.002697229152545333, 0.004866490140557289, 0.27537378668785095, 0.0058863842859864235, 0.004376258701086044, 0.004904803354293108, 0.0019107566913589835, 0.012556239031255245, 0.004626601003110409, 0.0021985922940075397, 0.0031842042226344347, 0.6423406004905701], [0.003268518252298236, 0.012585888616740704, 0.004801989998668432, 0.011512072756886482, 0.002396664349362254, 0.003887286875396967, 0.0017279277089983225, 0.283633291721344, 0.003216083627194166, 0.0019018937600776553, 0.005095880478620529, 0.0013470402918756008, 0.005691589321941137, 0.0033170911483466625, 0.001266071922145784, 0.0029785381630063057, 0.6513722538948059], [0.002200132003054023, 0.005949343089014292, 0.001533872913569212, 0.003685093717649579, 0.0005745512899011374, 0.001878154231235385, 0.0009750121389515698, 0.28034859895706177, 0.0033737639896571636, 0.0026834208983927965, 0.009662826545536518, 0.0014492352493107319, 0.006672397255897522, 0.004902185406535864, 0.004605260211974382, 0.00593555485829711, 0.6635706424713135], [0.001111872959882021, 0.004992477595806122, 0.0035979158710688353, 0.0033878388348966837, 0.0009550845134072006, 0.0020844158716499805, 0.0022347532212734222, 0.2775435745716095, 0.0037635385524481535, 0.0018603479256853461, 0.0032637796830385923, 0.0012520966120064259, 0.006311315111815929, 0.0037632009480148554, 
0.0020016119815409184, 0.003100041765719652, 0.6787762641906738], [0.006274266634136438, 0.015597670339047909, 0.0019414409762248397, 0.014022943563759327, 0.001832918613217771, 0.0031529469415545464, 0.0034341164864599705, 0.28615373373031616, 0.009429996833205223, 0.004364234395325184, 0.014302671886980534, 0.0033574793487787247, 0.021188819780945778, 0.008690145798027515, 0.0040708379819989204, 0.005184987559914589, 0.597000777721405], [0.0028356562834233046, 0.010088915936648846, 0.002142612123861909, 0.0065392423421144485, 0.0010128070134669542, 0.002616879530251026, 0.005847576539963484, 0.2856804430484772, 0.0028883544728159904, 0.0038166623562574387, 0.0035903428215533495, 0.0015511433593928814, 0.008969041518867016, 0.003468763316050172, 0.0032929072622209787, 0.0042611476965248585, 0.6513975262641907], [0.002507975557819009, 0.008276339620351791, 0.0015707237180322409, 0.0030484518501907587, 0.0005828459979966283, 0.0034471892286092043, 0.001863445620983839, 0.3165230453014374, 0.002972877351567149, 0.001105941948480904, 0.004632243886590004, 0.0015756785869598389, 0.0044962200336158276, 0.00519570941105485, 0.003240439807996154, 0.0046338895335793495, 0.6343269944190979], [0.0028794854879379272, 0.016901135444641113, 0.0030379469972103834, 0.007939142175018787, 0.0014289936516433954, 0.004951691720634699, 0.0038265036419034004, 0.3014661967754364, 0.0033361786045134068, 0.0020475382916629314, 0.004913855344057083, 0.0019554055761545897, 0.007757754065096378, 0.005630883853882551, 0.0032016821205615997, 0.00585649348795414, 0.622869074344635], [0.002648302586749196, 0.00996107142418623, 0.003456014906987548, 0.00959792174398899, 0.002230860060080886, 0.004387015476822853, 0.0014840762596577406, 0.32245972752571106, 0.001398561173118651, 0.0013238589745014906, 0.0011692801490426064, 0.0010149111039936543, 0.0028998046182096004, 0.0014115675585344434, 0.0014841669471934438, 0.002164526144042611, 0.6309084296226501]], [[0.07250411808490753, 
0.054838016629219055, 0.03025709092617035, 0.03168412297964096, 0.01117117889225483, 0.02596318908035755, 0.02019568160176277, 0.17584320902824402, 0.03772613778710365, 0.017292149364948273, 0.019033158197999, 0.015981798991560936, 0.10748451948165894, 0.06339283287525177, 0.026982644572854042, 0.025968389585614204, 0.2636817693710327], [0.00449938653036952, 0.007869871333241463, 0.005855365190654993, 0.010057754814624786, 0.003850877285003662, 0.006389282643795013, 0.0058118123561143875, 0.3005697727203369, 0.003926183097064495, 0.02833758294582367, 0.0019055336015298963, 0.004731093067675829, 0.006856117397546768, 0.0018985210917890072, 0.0037874961271882057, 0.005142093636095524, 0.5985113382339478], [0.002887526759877801, 0.005752807483077049, 0.021710360422730446, 0.01651262491941452, 0.007668592501431704, 0.006368731148540974, 0.010878670029342175, 0.2871168553829193, 0.001873801345936954, 0.018829399719834328, 0.0014974785735830665, 0.003380925627425313, 0.004635887686163187, 0.0014531337656080723, 0.005097417160868645, 0.009678236208856106, 0.5946574807167053], [0.0032901836093515158, 0.006757782306522131, 0.013796627521514893, 0.020112836733460426, 0.008938129991292953, 0.004249601159244776, 0.004514114465564489, 0.28531527519226074, 0.004239200614392757, 0.014444817788898945, 0.00107086100615561, 0.0034148218110203743, 0.004675847943872213, 0.0013690005289390683, 0.002087400760501623, 0.004465100821107626, 0.6172584891319275], [0.002208573743700981, 0.0023240309674292803, 0.008742237463593483, 0.005920421332120895, 0.00325100333429873, 0.003751414828002453, 0.007118323352187872, 0.29076340794563293, 0.0009553448180668056, 0.023026814684271812, 0.0007277590921148658, 0.0022186446003615856, 0.0026698641013354063, 0.000758698966819793, 0.004438812844455242, 0.004820294212549925, 0.6363043785095215], [0.01057443581521511, 0.013326886110007763, 0.013629400171339512, 0.011955070309340954, 0.005824630614370108, 0.02150745503604412, 0.014331351034343243, 
0.31061574816703796, 0.005320559721440077, 0.03795351833105087, 0.0034003520850092173, 0.008019414730370045, 0.008709263987839222, 0.003563303966075182, 0.008108800277113914, 0.008816267363727093, 0.5143435597419739], [0.023246681317687035, 0.036682963371276855, 0.0468820296227932, 0.07191494852304459, 0.043857038021087646, 0.05066068097949028, 0.02793613262474537, 0.23398587107658386, 0.010748549364507198, 0.04845832288265228, 0.003108717268332839, 0.009237070567905903, 0.011475318111479282, 0.004741620738059282, 0.008399915881454945, 0.011427357792854309, 0.3572368621826172], [0.001065234886482358, 0.0030463573057204485, 0.005035394802689552, 0.0063715483993291855, 0.0016224693972617388, 0.0017974732909351587, 0.0018142752815037966, 0.2865598499774933, 0.0011633801041170955, 0.004326092079281807, 0.0005308861727826297, 0.001274771522730589, 0.002319390419870615, 0.0009796912781894207, 0.0024901360739022493, 0.0032277535647153854, 0.6763752698898315], [0.006462869234383106, 0.004796402063220739, 0.0025641212705522776, 0.005078374408185482, 0.0010948996059596539, 0.00148495240136981, 0.002638250356540084, 0.24808421730995178, 0.006665803957730532, 0.015178035944700241, 0.003392704762518406, 0.00270862621255219, 0.018063317984342575, 0.009636455215513706, 0.005009351298213005, 0.0033341115340590477, 0.6638075709342957], [0.04251081123948097, 0.03791697695851326, 0.02159583382308483, 0.038590263575315475, 0.012144549749791622, 0.019638122990727425, 0.017482291907072067, 0.17553822696208954, 0.02592628449201584, 0.1147778257727623, 0.010924885980784893, 0.025900382548570633, 0.06118728220462799, 0.027059901505708694, 0.015693916007876396, 0.020478995516896248, 0.3326333463191986], [0.0016267145983874798, 0.0010850365506485105, 0.001038845512084663, 0.0027872996870428324, 0.0005339140770956874, 0.000669448112603277, 0.001308221137151122, 0.24292205274105072, 0.002076371805742383, 0.009975813329219818, 0.004298142623156309, 0.001802602899260819, 0.0131455073133111, 
0.004136352334171534, 0.004994882736355066, 0.0043364521116018295, 0.7032623887062073], [0.009279221296310425, 0.0059870085678994656, 0.004043014254420996, 0.0066634006798267365, 0.0019153476459905505, 0.0027499194256961346, 0.0026665106415748596, 0.25666531920433044, 0.005988645367324352, 0.02468833141028881, 0.004796822555363178, 0.0063317217864096165, 0.015372059307992458, 0.008533770218491554, 0.004591592121869326, 0.009010883048176765, 0.6307163238525391], [0.007363625802099705, 0.0056667448952794075, 0.0023991030175238848, 0.004123271908611059, 0.0010621732799336314, 0.0017300231847912073, 0.0015881824074313045, 0.25956991314888, 0.0074172052554786205, 0.01860951818525791, 0.0054871696047484875, 0.003380029695108533, 0.02878706157207489, 0.014314514584839344, 0.0049970257095992565, 0.005947984755039215, 0.6275563836097717], [0.0027076993137598038, 0.0016159438528120518, 0.0011922736885026097, 0.0012111011892557144, 0.00041656449320726097, 0.0010283944429829717, 0.0019091839203611016, 0.2607748508453369, 0.0019935735035687685, 0.0035810943227261305, 0.002863318659365177, 0.001211909344419837, 0.010081212036311626, 0.008832305669784546, 0.005484686233103275, 0.004947139415889978, 0.690148651599884], [0.004162090364843607, 0.003181458218023181, 0.005689968820661306, 0.00298204249702394, 0.001212175004184246, 0.0026147994212806225, 0.0022495894227176905, 0.2610611915588379, 0.0037972412537783384, 0.004069909453392029, 0.008129466325044632, 0.003986326977610588, 0.00973745808005333, 0.014778226613998413, 0.020307740196585655, 0.011769690550863743, 0.6402708292007446], [0.00649907486513257, 0.005668060854077339, 0.005573372822254896, 0.003342200070619583, 0.001615032204426825, 0.004149514250457287, 0.003641796763986349, 0.28125372529029846, 0.004318587016314268, 0.007333166431635618, 0.005432142876088619, 0.00388256274163723, 0.012013954110443592, 0.019479017704725266, 0.011880111880600452, 0.021110909059643745, 0.6028066873550415], [0.0014380291104316711, 
0.004073725547641516, 0.006834553554654121, 0.008995609357953072, 0.0022059043403714895, 0.0022548993583768606, 0.0023127931635826826, 0.28964006900787354, 0.0018081108573824167, 0.005677299574017525, 0.0007354802219197154, 0.0016671495977789164, 0.0033539654687047005, 0.001479115104302764, 0.003517432138323784, 0.004354709759354591, 0.6596513390541077]]]], \"left_text\": [\"[CLS]\", \"the\", \"cat\", \"sleeps\", \"on\", \"the\", \"mat\", \"[SEP]\", \"le\", \"chat\", \"do\", \"##rs\", \"sur\", \"le\", \"tap\", \"##is\", \"[SEP]\"], \"right_text\": [\"[CLS]\", \"the\", \"cat\", \"sleeps\", \"on\", \"the\", \"mat\", \"[SEP]\", \"le\", \"chat\", \"do\", \"##rs\", \"sur\", \"le\", \"tap\", \"##is\", \"[SEP]\"]}}, \"default_filter\": \"all\"}" ], "text/plain": [ "" ] }, "metadata": { "tags": [] } }, { "output_type": "display_data", "data": { "application/javascript": [ "/**\n", " * @fileoverview Transformer Visualization D3 javascript code.\n", " *\n", " *\n", " * Based on: https://github.com/tensorflow/tensor2tensor/blob/master/tensor2tensor/visualization/attention.js\n", " *\n", " * Change log:\n", " *\n", " * 12/19/18 Jesse Vig Assorted cleanup. 
Changed orientation of attention matrices.\n", " */\n", "\n", "requirejs(['jquery', 'd3'], function($, d3) {\n", "\n", "const TEXT_SIZE = 15;\n", "const BOXWIDTH = 110;\n", "const BOXHEIGHT = 22.5;\n", "const MATRIX_WIDTH = 115;\n", "const CHECKBOX_SIZE = 20;\n", "const TEXT_TOP = 30;\n", "const HEAD_COLORS = d3.scale.category10();\n", "\n", "var params = window.params;\n", "var config = {};\n", "initialize();\n", "\n", "function lighten(color) {\n", " var c = d3.hsl(color);\n", " var increment = (1 - c.l) * 0.6;\n", " c.l += increment;\n", " c.s -= increment;\n", " return c;\n", "}\n", "\n", "function transpose(mat) {\n", " return mat[0].map(function(col, i) {\n", " return mat.map(function(row) {\n", " return row[i];\n", " });\n", " });\n", "}\n", "\n", "function zip(a, b) {\n", " return a.map(function (e, i) {\n", " return [e, b[i]];\n", " });\n", "}\n", "\n", "function render() {\n", "\n", " var attnData = config.attention[config.filter];\n", " var leftText = attnData.left_text;\n", " var rightText = attnData.right_text;\n", " var attentionHeads = attnData.attn[config.layer];\n", "\n", " $(\"#vis svg\").empty();\n", " $(\"#vis\").empty();\n", "\n", " var height = config.initialTextLength * BOXHEIGHT + TEXT_TOP;\n", " var svg = d3.select(\"#vis\")\n", " .append('svg')\n", " .attr(\"width\", \"100%\")\n", " .attr(\"height\", height + \"px\");\n", "\n", " var attData = [];\n", " for (var i=0; i < config.nHeads; i++) {\n", " var att = attentionHeads[i];\n", " var att_trans = transpose(att);\n", " attData.push(zip(att_trans, att));\n", " }\n", "\n", " renderText(svg, leftText, true, attData, 0);\n", " renderText(svg, rightText, false, attData, MATRIX_WIDTH + BOXWIDTH);\n", "\n", " renderAttentionHighlights(svg, attData);\n", "\n", " svg.append(\"g\").classed(\"attentionHeads\", true);\n", "\n", " renderAttention(svg, attentionHeads);\n", "\n", " drawCheckboxes(0, svg, attentionHeads);\n", "\n", "}\n", "\n", "function renderText(svg, text, isLeft, attData, leftPos) 
{\n", " // attData: list of tuples (att, att_trans), one for each layer. att and att_trans are attention matrices for each layer.\n", " // att is of shape [nHeads, source_len, target_len]\n", " var id = isLeft ? \"left\" : \"right\";\n", " var textContainer = svg.append(\"svg:g\")\n", " .attr(\"id\", id);\n", "\n", " textContainer.append(\"g\").classed(\"attentionBoxes\", true)\n", " .selectAll(\"g\")\n", " .data(attData)\n", " .enter()\n", " .append(\"g\")\n", " .selectAll(\"rect\")\n", " .data(function(d) {return d;})\n", " .enter()\n", " .append(\"rect\")\n", " .attr(\"x\", function(d, i, j) {\n", " return leftPos + boxOffsets(j);\n", " })\n", " .attr(\"y\", function(d, i) {\n", " return (+1) * BOXHEIGHT;\n", " })\n", " .attr(\"width\", BOXWIDTH / activeHeads())\n", " .attr(\"height\", function() { return BOXHEIGHT; })\n", " .attr(\"fill\", function(d, i, j) {\n", " return HEAD_COLORS(j);\n", " })\n", " .style(\"opacity\", 0.0);\n", "\n", " var tokenContainer = textContainer.append(\"g\").selectAll(\"g\")\n", " .data(text)\n", " .enter()\n", " .append(\"g\");\n", "\n", " tokenContainer.append(\"rect\")\n", " .classed(\"background\", true)\n", " .style(\"opacity\", 0.0)\n", " .attr(\"fill\", \"lightgray\")\n", " .attr(\"x\", leftPos)\n", " .attr(\"y\", function(d, i) {\n", " return TEXT_TOP + i * BOXHEIGHT;\n", " })\n", " .attr(\"width\", BOXWIDTH)\n", " .attr(\"height\", BOXHEIGHT);\n", "\n", " var textEl = tokenContainer.append(\"text\")\n", " .text(function(d) { return d; })\n", " .attr(\"font-size\", TEXT_SIZE + \"px\")\n", " .style(\"cursor\", \"default\")\n", " .style(\"-webkit-user-select\", \"none\")\n", " .attr(\"x\", leftPos)\n", " .attr(\"y\", function(d, i) {\n", " return TEXT_TOP + i * BOXHEIGHT;\n", " });\n", "\n", " if (isLeft) {\n", " textEl.style(\"text-anchor\", \"end\")\n", " .attr(\"dx\", BOXWIDTH - 0.5 * TEXT_SIZE)\n", " .attr(\"dy\", TEXT_SIZE);\n", " } else {\n", " textEl.style(\"text-anchor\", \"start\")\n", " .attr(\"dx\", + 0.5 * 
TEXT_SIZE)\n", " .attr(\"dy\", TEXT_SIZE);\n", " }\n", "\n", " tokenContainer.on(\"mouseover\", function(d, index) {\n", " textContainer.selectAll(\".background\")\n", " .style(\"opacity\", function(d, i) {\n", " return i == index ? 1.0 : 0.0;\n", " });\n", "\n", " svg.selectAll(\".attentionHeads\").style(\"display\", \"none\");\n", "\n", " svg.selectAll(\".lineHeads\") // To get the nesting to work.\n", " .selectAll(\".attLines\")\n", " .attr(\"stroke-opacity\", function(d) {\n", " return 1.0;\n", " })\n", " .attr(\"y1\", function(d, i) {\n", " if (isLeft) {\n", " return TEXT_TOP + index * BOXHEIGHT + (BOXHEIGHT/2);\n", " } else {\n", " return TEXT_TOP + i * BOXHEIGHT + (BOXHEIGHT/2);\n", " }\n", " })\n", " .attr(\"x1\", BOXWIDTH)\n", " .attr(\"y2\", function(d, i) {\n", " if (isLeft) {\n", " return TEXT_TOP + i * BOXHEIGHT + (BOXHEIGHT/2);\n", " } else {\n", " return TEXT_TOP + index * BOXHEIGHT + (BOXHEIGHT/2);\n", " }\n", " })\n", " .attr(\"x2\", BOXWIDTH + MATRIX_WIDTH)\n", " .attr(\"stroke-width\", 2)\n", " .attr(\"stroke\", function(d, i, j) {\n", " return HEAD_COLORS(j);\n", " })\n", " .attr(\"stroke-opacity\", function(d, i, j) {\n", " if (isLeft) {d = d[0];} else {d = d[1];}\n", " if (config.headVis[j]) {\n", " if (d) {\n", " return d[index];\n", " } else {\n", " return 0.0;\n", " }\n", " } else {\n", " return 0.0;\n", " }\n", " });\n", "\n", " function updateAttentionBoxes() {\n", " var id = isLeft ? \"right\" : \"left\";\n", " var leftPos = isLeft ? 
MATRIX_WIDTH + BOXWIDTH : 0;\n", " svg.select(\"#\" + id)\n", " .selectAll(\".attentionBoxes\")\n", " .selectAll(\"g\")\n", " .selectAll(\"rect\")\n", " .attr(\"x\", function(d, i, j) { return leftPos + boxOffsets(j); })\n", " .attr(\"y\", function(d, i) { return TEXT_TOP + i * BOXHEIGHT; })\n", " .attr(\"width\", BOXWIDTH/activeHeads())\n", " .attr(\"height\", function() { return BOXHEIGHT; })\n", " .style(\"opacity\", function(d, i, j) {\n", " if (isLeft) {d = d[0];} else {d = d[1];}\n", " if (config.headVis[j])\n", " if (d) {\n", " return d[index];\n", " } else {\n", " return 0.0;\n", " }\n", " else\n", " return 0.0;\n", " });\n", " }\n", "\n", " updateAttentionBoxes();\n", " });\n", "\n", " textContainer.on(\"mouseleave\", function() {\n", " d3.select(this).selectAll(\".background\")\n", " .style(\"opacity\", 0.0);\n", " svg.selectAll(\".attLines\").attr(\"stroke-opacity\", 0.0);\n", " svg.selectAll(\".attentionHeads\").style(\"display\", \"inline\");\n", " svg.selectAll(\".attentionBoxes\")\n", " .selectAll(\"g\")\n", " .selectAll(\"rect\")\n", " .style(\"opacity\", 0.0);\n", " });\n", "}\n", "\n", "function renderAttentionHighlights(svg, attention) {\n", " var line_container = svg.append(\"g\");\n", " line_container.selectAll(\"g\")\n", " .data(attention)\n", " .enter()\n", " .append(\"g\")\n", " .classed(\"lineHeads\", true)\n", " .selectAll(\"line\")\n", " .data(function(d){return d;})\n", " .enter()\n", " .append(\"line\").classed(\"attLines\", true);\n", "}\n", "\n", "function renderAttention(svg, attentionHeads) {\n", " var line_container = svg.selectAll(\".attentionHeads\");\n", " line_container.html(null);\n", " for(var h=0; h\").val(i).text(i));\n", "}\n", "\n", "$(\"#layer\").on('change', function(e) {\n", " config.layer = +e.currentTarget.value;\n", " render();\n", "});\n", "\n", "$(\"#filter\").on('change', function(e) {\n", " config.filter = e.currentTarget.value;\n", " render();\n", "});\n", "\n", "render();\n", "\n", "});" ], "text/plain": [ "" 
] }, "metadata": { "tags": [] } } ] } ] } ================================================ FILE: Chapter07/Summarizing_Text_with_T5.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Summarizing_Text_with_T5.ipynb", "provenance": [], "collapsed_sections": [] }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "widgets": { "application/vnd.jupyter.widget-state+json": { "311dd5614ff1456fa608b06c6a486333": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_483e8e39248842cca6fdb69497417033", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_898bb243539e4693a1a024e54dac42e2", "IPY_MODEL_ab283be8197e4a638fd83dc475b57486" ] } }, "483e8e39248842cca6fdb69497417033": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": 
null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "898bb243539e4693a1a024e54dac42e2": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_af7cb58500024f7cac6545a44045bdd1", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 1200, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 1200, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_d252ead0d3fc4885bed21c592dc9394c" } }, "ab283be8197e4a638fd83dc475b57486": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_8ed15a3ae7c64fdea663832082e80529", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 1.20k/1.20k [03:27<00:00, 5.78B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_6a4014978e96442789889fef7919d87f" } }, "af7cb58500024f7cac6545a44045bdd1": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "d252ead0d3fc4885bed21c592dc9394c": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, 
"right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "8ed15a3ae7c64fdea663832082e80529": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "6a4014978e96442789889fef7919d87f": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": 
"LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "173d4c16ef284a6085b183da2be8574e": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_15d71a8c88ab4dc2bfb20853b5bac3d3", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_fef9e2aa436040c0822605be0df0c359", "IPY_MODEL_9b0393c7c87e42838da0e38fb72b2941" ] } }, "15d71a8c88ab4dc2bfb20853b5bac3d3": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, 
"object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "fef9e2aa436040c0822605be0df0c359": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_3707cb9b877c451bb245d051c19af8fa", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 2950825948, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 2950825948, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_28ac8f3a26ca4c51ab913c482234f3ab" } }, "9b0393c7c87e42838da0e38fb72b2941": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_3ef057edd14f4fafbbb0b6fb81e9ba97", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 2.95G/2.95G [03:27<00:00, 14.2MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_8ff347600d6f42e5bf198232909fe086" } }, "3707cb9b877c451bb245d051c19af8fa": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "28ac8f3a26ca4c51ab913c482234f3ab": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", 
"grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "3ef057edd14f4fafbbb0b6fb81e9ba97": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "8ff347600d6f42e5bf198232909fe086": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": 
null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "31a90525042744a49f8d48305354b9ec": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_55811622761d49559291f3e74f072d31", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_3c68af00a0784efca7a3dce29c060eef", "IPY_MODEL_07acd3ae43eb4dfb893cc65e72320367" ] } }, "55811622761d49559291f3e74f072d31": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", 
"grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "3c68af00a0784efca7a3dce29c060eef": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_4a1b68c322254a53a77e4d05e9ee7631", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 791656, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 791656, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_e7c516f923f74666a8a4b11e17c333bd" } }, "07acd3ae43eb4dfb893cc65e72320367": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_0be43a9c00324bb9abeedeb2a67b883a", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 792k/792k [00:00<00:00, 1.68MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_369a4af884f3436683d68ec126d7b6ec" } }, "4a1b68c322254a53a77e4d05e9ee7631": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "e7c516f923f74666a8a4b11e17c333bd": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", 
"grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "0be43a9c00324bb9abeedeb2a67b883a": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "369a4af884f3436683d68ec126d7b6ec": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": 
null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } } } } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "RcdcqBkV0MTU" }, "source": [ "#Summarizing Text with T5\n", "Copyright 2020, Denis Rothman. MIT License. Hugging Face usage example was modified for educational purposes.\n", "\n", "[Hugging Face Models](https://huggingface.co/transformers/model_doc/t5.html)\n", "\n", "[Hugging Face Framework Usage](https://huggingface.co/transformers/usage.html)\n" ] }, { "cell_type": "code", "metadata": { "id": "06QFZGxsf_KJ", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "4bb34ef6-44f5-4ae5-d334-8fa716bf9656" }, "source": [ "!pip install transformers==4.0.0" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Collecting transformers==4.0.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/99/84/7bc03215279f603125d844bf81c3fb3f2d50fe8e511546eb4897e4be2067/transformers-4.0.0-py3-none-any.whl (1.4MB)\n", "\u001b[K |████████████████████████████████| 1.4MB 9.1MB/s \n", "\u001b[?25hRequirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (3.0.12)\n", "Requirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (2019.12.20)\n", "Requirement already satisfied: dataclasses; python_version < \"3.7\" in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (0.8)\n", "Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) 
(4.41.1)\n", "Collecting tokenizers==0.9.4\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/0f/1c/e789a8b12e28be5bc1ce2156cf87cb522b379be9cadc7ad8091a4cc107c4/tokenizers-0.9.4-cp36-cp36m-manylinux2010_x86_64.whl (2.9MB)\n", "\u001b[K |████████████████████████████████| 2.9MB 26.2MB/s \n", "\u001b[?25hRequirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (2.23.0)\n", "Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (20.4)\n", "Collecting sacremoses\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\n", "\u001b[K |████████████████████████████████| 890kB 41.5MB/s \n", "\u001b[?25hRequirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers==4.0.0) (1.18.5)\n", "Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (3.0.4)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (2020.11.8)\n", "Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (2.10)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers==4.0.0) (1.24.3)\n", "Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from packaging->transformers==4.0.0) (1.15.0)\n", "Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers==4.0.0) (2.4.7)\n", "Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.0.0) (7.1.2)\n", "Requirement already satisfied: joblib in 
/usr/local/lib/python3.6/dist-packages (from sacremoses->transformers==4.0.0) (0.17.0)\n", "Building wheels for collected packages: sacremoses\n", " Building wheel for sacremoses (setup.py) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893257 sha256=cd41d8610aac8d370f5d6a1c57d73172911ecb039b55e1a3e3f8780a2420a753\n", " Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\n", "Successfully built sacremoses\n", "Installing collected packages: tokenizers, sacremoses, transformers\n", "Successfully installed sacremoses-0.0.43 tokenizers-0.9.4 transformers-4.0.0\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "3tYFf-cEIkKL", "outputId": "6061799b-9e78-42c5-9902-d268217273e4" }, "source": [ "!pip install sentencepiece==0.1.94" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Collecting sentencepiece==0.1.94\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/e5/2d/6d4ca4bef9a67070fa1cac508606328329152b1df10bdf31fb6e4e727894/sentencepiece-0.1.94-cp36-cp36m-manylinux2014_x86_64.whl (1.1MB)\n", "\u001b[K |████████████████████████████████| 1.1MB 9.5MB/s \n", "\u001b[?25hInstalling collected packages: sentencepiece\n", "Successfully installed sentencepiece-0.1.94\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "FEQO4tDl7xH_" }, "source": [ "display_architecture=True" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "q8suV48O07TW", "colab": { "base_uri": "https://localhost:8080/", "height": 235, "referenced_widgets": [ "311dd5614ff1456fa608b06c6a486333", "483e8e39248842cca6fdb69497417033", "898bb243539e4693a1a024e54dac42e2", "ab283be8197e4a638fd83dc475b57486", "af7cb58500024f7cac6545a44045bdd1", "d252ead0d3fc4885bed21c592dc9394c", "8ed15a3ae7c64fdea663832082e80529", 
"6a4014978e96442789889fef7919d87f", "173d4c16ef284a6085b183da2be8574e", "15d71a8c88ab4dc2bfb20853b5bac3d3", "fef9e2aa436040c0822605be0df0c359", "9b0393c7c87e42838da0e38fb72b2941", "3707cb9b877c451bb245d051c19af8fa", "28ac8f3a26ca4c51ab913c482234f3ab", "3ef057edd14f4fafbbb0b6fb81e9ba97", "8ff347600d6f42e5bf198232909fe086", "31a90525042744a49f8d48305354b9ec", "55811622761d49559291f3e74f072d31", "3c68af00a0784efca7a3dce29c060eef", "07acd3ae43eb4dfb893cc65e72320367", "4a1b68c322254a53a77e4d05e9ee7631", "e7c516f923f74666a8a4b11e17c333bd", "0be43a9c00324bb9abeedeb2a67b883a", "369a4af884f3436683d68ec126d7b6ec" ] }, "outputId": "cb65b089-f9f1-423c-8c35-f5f39bc7c139" }, "source": [ "import torch\n", "import json \n", "from transformers import T5Tokenizer, T5ForConditionalGeneration, T5Config\n", "\n", "model = T5ForConditionalGeneration.from_pretrained('t5-large')\n", "tokenizer = T5Tokenizer.from_pretrained('t5-large')\n", "device = torch.device('cpu')" ], "execution_count": null, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "311dd5614ff1456fa608b06c6a486333", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=1200.0, style=ProgressStyle(description…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "173d4c16ef284a6085b183da2be8574e", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=2950825948.0, style=ProgressStyle(descr…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "Some weights of the model checkpoint at t5-large were not used when initializing T5ForConditionalGeneration: 
['decoder.block.0.layer.1.EncDecAttention.relative_attention_bias.weight']\n", "- This IS expected if you are initializing T5ForConditionalGeneration from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", "- This IS NOT expected if you are initializing T5ForConditionalGeneration from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n" ], "name": "stderr" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "31a90525042744a49f8d48305354b9ec", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=791656.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "Q6zHDK7I1GsY", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "21dc9317-47da-414e-b6fb-a24f60e1cda6" }, "source": [ "if display_architecture==True:\n", " print(model.config)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "T5Config {\n", " \"_name_or_path\": \"t5-large\",\n", " \"architectures\": [\n", " \"T5WithLMHeadModel\"\n", " ],\n", " \"d_ff\": 4096,\n", " \"d_kv\": 64,\n", " \"d_model\": 1024,\n", " \"decoder_start_token_id\": 0,\n", " \"dropout_rate\": 0.1,\n", " \"eos_token_id\": 1,\n", " \"feed_forward_proj\": \"relu\",\n", " \"initializer_factor\": 1.0,\n", " \"is_encoder_decoder\": true,\n", " \"layer_norm_epsilon\": 1e-06,\n", " \"model_type\": \"t5\",\n", " \"n_positions\": 512,\n", " \"num_decoder_layers\": 24,\n", " \"num_heads\": 16,\n", " \"num_layers\": 24,\n", " \"output_past\": true,\n", " \"pad_token_id\": 0,\n", " \"relative_attention_num_buckets\": 32,\n", " 
\"task_specific_params\": {\n", " \"summarization\": {\n", " \"early_stopping\": true,\n", " \"length_penalty\": 2.0,\n", " \"max_length\": 200,\n", " \"min_length\": 30,\n", " \"no_repeat_ngram_size\": 3,\n", " \"num_beams\": 4,\n", " \"prefix\": \"summarize: \"\n", " },\n", " \"translation_en_to_de\": {\n", " \"early_stopping\": true,\n", " \"max_length\": 300,\n", " \"num_beams\": 4,\n", " \"prefix\": \"translate English to German: \"\n", " },\n", " \"translation_en_to_fr\": {\n", " \"early_stopping\": true,\n", " \"max_length\": 300,\n", " \"num_beams\": 4,\n", " \"prefix\": \"translate English to French: \"\n", " },\n", " \"translation_en_to_ro\": {\n", " \"early_stopping\": true,\n", " \"max_length\": 300,\n", " \"num_beams\": 4,\n", " \"prefix\": \"translate English to Romanian: \"\n", " }\n", " },\n", " \"use_cache\": true,\n", " \"vocab_size\": 32128\n", "}\n", "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "5LaWN15NPIPC", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "623480ce-b408-462b-a8c5-5c12959f724c" }, "source": [ "if(display_architecture==True):\n", " print(model)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "T5ForConditionalGeneration(\n", " (shared): Embedding(32128, 1024)\n", " (encoder): T5Stack(\n", " (embed_tokens): Embedding(32128, 1024)\n", " (block): ModuleList(\n", " (0): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " (relative_attention_bias): Embedding(32, 16)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): 
Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (1): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (2): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (3): T5Block(\n", " 
(layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (4): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (5): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): 
Linear(in_features=1024, out_features=1024, bias=False)\n", "            )\n", "            (layer_norm): T5LayerNorm()\n", "            (dropout): Dropout(p=0.1, inplace=False)\n", "          )\n", "          (1): T5LayerFF(\n", "            (DenseReluDense): T5DenseReluDense(\n", "              (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", "              (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", "              (dropout): Dropout(p=0.1, inplace=False)\n", "            )\n", "            (layer_norm): T5LayerNorm()\n", "            (dropout): Dropout(p=0.1, inplace=False)\n", "          )\n", "        )\n", "      )\n",
    "      [... encoder blocks (6)-(23) omitted: each repeats the same T5Block structure as above (T5LayerSelfAttention with q/k/v/o Linear 1024->1024 and T5LayerFF with wi 1024->4096, wo 4096->1024, each followed by T5LayerNorm and Dropout(p=0.1)) ...]\n", "    )\n", "    (final_layer_norm): T5LayerNorm()\n", "    (dropout): Dropout(p=0.1, inplace=False)\n", "  )\n",
    "  (decoder): T5Stack(\n", "    (embed_tokens): Embedding(32128, 1024)\n", "    (block): ModuleList(\n", "      (0): T5Block(\n", "        (layer): ModuleList(\n", "          (0): T5LayerSelfAttention(\n", "            (SelfAttention): T5Attention(\n", "              (q): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (k): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (v): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (o): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (relative_attention_bias): Embedding(32, 16)\n", "            )\n", "            (layer_norm): T5LayerNorm()\n", "            (dropout): Dropout(p=0.1, inplace=False)\n", "          )\n",
    "          (1): T5LayerCrossAttention(\n", "            (EncDecAttention): T5Attention(\n", "              (q): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (k): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (v): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (o): Linear(in_features=1024, out_features=1024, bias=False)\n", "            )\n", "            (layer_norm): T5LayerNorm()\n", "            (dropout): Dropout(p=0.1, inplace=False)\n", "          )\n",
    "          (2): T5LayerFF(\n", "            (DenseReluDense): T5DenseReluDense(\n", "              (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", "              (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", "              (dropout): Dropout(p=0.1, inplace=False)\n", "            )\n", "            (layer_norm): T5LayerNorm()\n", "            (dropout): Dropout(p=0.1, inplace=False)\n", "          )\n", "        )\n", "      )\n",
    "      [... decoder blocks (1)-(19) omitted: each repeats the structure of block (0) above, minus the relative_attention_bias embedding ...]\n",
    "      (20): T5Block(\n", "        (layer): ModuleList(\n", "          (0): T5LayerSelfAttention(\n", "            (SelfAttention): T5Attention(\n", "              (q): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (k): Linear(in_features=1024, out_features=1024, bias=False)\n", "              (v): Linear(in_features=1024, out_features=1024, bias=False)\n",
" (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (21): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, 
bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (22): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (23): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " 
)\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " (final_layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (lm_head): Linear(in_features=1024, out_features=32128, bias=False)\n", ")\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "DS2twf1P1UYI", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "e060a702-cf27-41b4-d162-ffb21872b81c" }, "source": [ "if display_architecture==True:\n", " print(model.encoder)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "T5Stack(\n", " (embed_tokens): Embedding(32128, 1024)\n", " (block): ModuleList(\n", " (0): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " (relative_attention_bias): Embedding(32, 16)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): 
T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (1): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (2): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", 
" (3): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (4): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (5): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", 
" (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (6): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (7): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): 
Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (8): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (9): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (10): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " 
(SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (11): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (12): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (13): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (14): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, 
inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (15): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (16): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (17): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", 
" (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (18): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (19): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): 
T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (20): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (21): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, 
inplace=False)\n", " )\n", " )\n", " )\n", " (22): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (23): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " (final_layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", ")\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "MCwdhX9U1MA5", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": 
"ae39b023-77ec-483d-fb33-2bc59d2c0996" }, "source": [ "if display_architecture==True:\n", " print(model.decoder)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "T5Stack(\n", " (embed_tokens): Embedding(32128, 1024)\n", " (block): ModuleList(\n", " (0): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " (relative_attention_bias): Embedding(32, 16)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (1): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", 
" (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (2): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, 
bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (3): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (4): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): 
T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (5): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " 
)\n", " )\n", " )\n", " (6): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (7): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (8): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (9): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (10): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (11): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (12): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (13): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): 
T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (14): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (15): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (16): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, 
bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (17): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (18): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): 
T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (19): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " 
)\n", " )\n", " )\n", " (20): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (21): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (22): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (23): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " (final_layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", ")\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "GmrCDtcL1hPn", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "d93ba363-7336-4514-b693-b4c33dd8cb07" }, "source": [ "if display_architecture==True:\n", " print(model.forward)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "S5KfhCrifP01" }, "source": [ "\n", "def summarize(text,ml):\n", " preprocess_text = text.strip().replace(\"\\n\",\"\")\n", " t5_prepared_Text = \"summarize: \"+preprocess_text\n", " print (\"Preprocessed and prepared text: \\n\", t5_prepared_Text)\n", "\n", " tokenized_text = tokenizer.encode(t5_prepared_Text, 
return_tensors=\"pt\").to(device)\n", "\n", " # summarize\n", " summary_ids = model.generate(tokenized_text,\n", " num_beams=4,\n", " no_repeat_ngram_size=2,\n", " min_length=30,\n", " max_length=ml,\n", " early_stopping=True)\n", "\n", " output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)\n", " return output" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "vqiTNoDc7pOv", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "d46bc587-b39d-4e8e-bd5e-3d4d6d529168" }, "source": [ "text=\"\"\"\n", "The United States Declaration of Independence was the first Etext\n", "released by Project Gutenberg, early in 1971. The title was stored\n", "in an emailed instruction set which required a tape or diskpack be\n", "hand mounted for retrieval. The diskpack was the size of a large\n", "cake in a cake carrier, cost $1500, and contained 5 megabytes, of\n", "which this file took 1-2%. Two tape backups were kept plus one on\n", "paper tape. The 10,000 files we hope to have online by the end of\n", "2001 should take about 1-2% of a comparably priced drive in 2001.\n", "\"\"\"\n", "print(\"Number of characters:\",len(text))\n", "summary=summarize(text,50)\n", "print (\"\\n\\nSummarized text: \\n\",summary)\n" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Number of characters: 534\n", "Preprocessed and prepared text: \n", " summarize: The United States Declaration of Independence was the first Etextreleased by Project Gutenberg, early in 1971. The title was storedin an emailed instruction set which required a tape or diskpack behand mounted for retrieval. The diskpack was the size of a largecake in a cake carrier, cost $1500, and contained 5 megabytes, ofwhich this file took 1-2%. Two tape backups were kept plus one onpaper tape.
The 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in 2001.\n", "\n", "\n", "Summarized text: \n", " the united states declaration of independence was the first etext published by project gutenberg, early in 1971. the 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "2321zS1Q3jPX", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "fccf0c6e-926f-4f65-b3ba-b0e19715f87b" }, "source": [ "#Bill of Rights,V\n", "text =\"\"\"\n", "No person shall be held to answer for a capital, or otherwise infamous crime,\n", "unless on a presentment or indictment of a Grand Jury, except in cases arising\n", "in the land or naval forces, or in the Militia, when in actual service\n", "in time of War or public danger; nor shall any person be subject for\n", "the same offense to be twice put in jeopardy of life or limb;\n", "nor shall be compelled in any criminal case to be a witness against himself,\n", "nor be deprived of life, liberty, or property, without due process of law;\n", "nor shall private property be taken for public use without just compensation.\n", "\n", "\"\"\"\n", "print(\"Number of characters:\",len(text))\n", "summary=summarize(text,50)\n", "print (\"\\n\\nSummarized text: \\n\",summary)\n", " " ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Number of characters: 591\n", "Preprocessed and prepared text: \n", " summarize: No person shall be held to answer for a capital, or otherwise infamous crime,unless on a presentment or indictment of a Grand Jury, except in cases arisingin the land or naval forces, or in the Militia, when in actual servicein time of War or public danger; nor shall any person be subject forthe same offense to be twice put in jeopardy of life or limb;nor shall be compelled in any criminal case to be a witness against 
himself,nor be deprived of life, liberty, or property, without due process of law;nor shall private property be taken for public use without just compensation.\n", "\n", "\n", "Summarized text: \n", " no person shall be held to answer for a capital, or otherwise infamous crime, unless ona presentment or indictment ofa Grand Jury. nor shall any person be subject for the same offense to be twice put\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "k_h8oQ55_zr5", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "7c957d5a-6711-4169-9244-51562a6cc9cd" }, "source": [ "#Montana Corporate Law\n", "#https://corporations.uslegal.com/state-corporation-law/montana-corporation-law/#:~:text=Montana%20Corporation%20Law,carrying%20out%20its%20business%20activities.\n", "\n", "text =\"\"\"The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose. In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities. The corporation can sue and be sued in its corporate name. It has perpetual succession. The corporation can buy, sell or otherwise acquire an interest in a real or personal property. It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country. It can appoint officers and agents of the corporation for various duties and fix their compensation.\n", "The name of a corporation must contain the word “corporation” or its abbreviation “corp.” The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state. 
It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.\n", "The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing. The qualifications for directors are fixed either by articles of incorporation or bylaws. The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation. The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators. The shareholders have the power to change the size of board of directors.\n", "\"\"\"\n", "print(\"Number of characters:\",len(text))\n", "summary=summarize(text,50)\n", "print (\"\\n\\nSummarized text: \\n\",summary)\n", " " ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Number of characters: 1816\n", "Preprocessed and prepared text: \n", " summarize: The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose. In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities. The corporation can sue and be sued in its corporate name. It has perpetual succession. The corporation can buy, sell or otherwise acquire an interest in a real or personal property. It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country. 
It can appoint officers and agents of the corporation for various duties and fix their compensation.The name of a corporation must contain the word “corporation” or its abbreviation “corp.” The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state. It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing. The qualifications for directors are fixed either by articles of incorporation or bylaws. The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation. The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators. The shareholders have the power to change the size of board of directors.\n", "\n", "\n", "Summarized text: \n", " a corporation can be incorporated in the state of Montana to serve any lawful purpose. the corporation has perpetual succession and can sue and be sued in its corporate name. 
it can conduct business, carry on operations, and have offices\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter08/Summarizing_Text_V2.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Summarizing Text V2.ipynb", "provenance": [], "collapsed_sections": [] }, "kernelspec": { "name": "python3", "display_name": "Python 3" } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "RcdcqBkV0MTU" }, "source": [ "#Summarizing Text with T5\n", "Copyright 2020, Denis Rothman. MIT License. Hugging Face usage example was modified for educational purposes.\n", "\n", "[Hugging Face Models](https://huggingface.co/transformers/model_doc/t5.html)\n", "\n", "[Hugging Face Framework Usage](https://huggingface.co/transformers/usage.html)\n" ] }, { "cell_type": "code", "metadata": { "id": "06QFZGxsf_KJ", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "931f8431-e4c2-4144-bad7-60a4df9e6ae8" }, "source": [ "!pip install transformers" ], "execution_count": 13, "outputs": [ { "output_type": "stream", "text": [ "Requirement already satisfied: transformers in /usr/local/lib/python3.6/dist-packages (4.1.1)\n", "Requirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.6/dist-packages (from transformers) (4.41.1)\n", "Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from transformers) (2.23.0)\n", "Requirement already satisfied: sacremoses in /usr/local/lib/python3.6/dist-packages (from transformers) (0.0.43)\n", "Requirement already satisfied: filelock in /usr/local/lib/python3.6/dist-packages (from transformers) (3.0.12)\n", "Requirement already satisfied: tokenizers==0.9.4 in /usr/local/lib/python3.6/dist-packages (from transformers) (0.9.4)\n", "Requirement already satisfied: dataclasses; python_version < \"3.7\" in /usr/local/lib/python3.6/dist-packages (from transformers) (0.8)\n", "Requirement already 
satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers) (2019.12.20)\n", "Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers) (20.8)\n", "Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from transformers) (1.19.4)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2020.12.5)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (1.24.3)\n", "Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (2.10)\n", "Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->transformers) (3.0.4)\n", "Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (1.15.0)\n", "Requirement already satisfied: joblib in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (1.0.0)\n", "Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers) (7.1.2)\n", "Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers) (2.4.7)\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "3tYFf-cEIkKL", "outputId": "f72a822c-f875-40e3-cc3f-eb7b37e5e47b" }, "source": [ "!pip install sentencepiece==0.1.94" ], "execution_count": 14, "outputs": [ { "output_type": "stream", "text": [ "Requirement already satisfied: sentencepiece==0.1.94 in /usr/local/lib/python3.6/dist-packages (0.1.94)\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "FEQO4tDl7xH_" }, "source": [ "display_architecture=True" ], "execution_count": 15, "outputs": [] }, { 
"cell_type": "code", "metadata": { "id": "q8suV48O07TW", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "0743f9df-e4a8-46c3-9d8c-595698fa78a5" }, "source": [ "import torch\n", "import json \n", "from transformers import T5Tokenizer, T5ForConditionalGeneration, T5Config\n", "\n", "model = T5ForConditionalGeneration.from_pretrained('t5-large')\n", "tokenizer = T5Tokenizer.from_pretrained('t5-large')\n", "device = torch.device('cpu')" ], "execution_count": 16, "outputs": [ { "output_type": "stream", "text": [ "Some weights of the model checkpoint at t5-large were not used when initializing T5ForConditionalGeneration: ['decoder.block.0.layer.1.EncDecAttention.relative_attention_bias.weight']\n", "- This IS expected if you are initializing T5ForConditionalGeneration from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", "- This IS NOT expected if you are initializing T5ForConditionalGeneration from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n" ], "name": "stderr" } ] }, { "cell_type": "code", "metadata": { "id": "Q6zHDK7I1GsY", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "c40c5620-12c6-4a99-9f86-7eef28ec0871" }, "source": [ "if display_architecture==True:\n", " print(model.config)" ], "execution_count": 17, "outputs": [ { "output_type": "stream", "text": [ "T5Config {\n", " \"_name_or_path\": \"t5-large\",\n", " \"architectures\": [\n", " \"T5WithLMHeadModel\"\n", " ],\n", " \"d_ff\": 4096,\n", " \"d_kv\": 64,\n", " \"d_model\": 1024,\n", " \"decoder_start_token_id\": 0,\n", " \"dropout_rate\": 0.1,\n", " \"eos_token_id\": 1,\n", " \"feed_forward_proj\": \"relu\",\n", " \"initializer_factor\": 1.0,\n", " \"is_encoder_decoder\": true,\n", " \"layer_norm_epsilon\": 1e-06,\n", " \"model_type\": 
\"t5\",\n", " \"n_positions\": 512,\n", " \"num_decoder_layers\": 24,\n", " \"num_heads\": 16,\n", " \"num_layers\": 24,\n", " \"output_past\": true,\n", " \"pad_token_id\": 0,\n", " \"relative_attention_num_buckets\": 32,\n", " \"task_specific_params\": {\n", " \"summarization\": {\n", " \"early_stopping\": true,\n", " \"length_penalty\": 2.0,\n", " \"max_length\": 200,\n", " \"min_length\": 30,\n", " \"no_repeat_ngram_size\": 3,\n", " \"num_beams\": 4,\n", " \"prefix\": \"summarize: \"\n", " },\n", " \"translation_en_to_de\": {\n", " \"early_stopping\": true,\n", " \"max_length\": 300,\n", " \"num_beams\": 4,\n", " \"prefix\": \"translate English to German: \"\n", " },\n", " \"translation_en_to_fr\": {\n", " \"early_stopping\": true,\n", " \"max_length\": 300,\n", " \"num_beams\": 4,\n", " \"prefix\": \"translate English to French: \"\n", " },\n", " \"translation_en_to_ro\": {\n", " \"early_stopping\": true,\n", " \"max_length\": 300,\n", " \"num_beams\": 4,\n", " \"prefix\": \"translate English to Romanian: \"\n", " }\n", " },\n", " \"use_cache\": true,\n", " \"vocab_size\": 32128\n", "}\n", "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "5LaWN15NPIPC", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "d746c169-acc4-4e6a-e704-b7c4f635adc2" }, "source": [ "if(display_architecture==True):\n", " print(model)" ], "execution_count": 18, "outputs": [ { "output_type": "stream", "text": [ "T5ForConditionalGeneration(\n", " (shared): Embedding(32128, 1024)\n", " (encoder): T5Stack(\n", " (embed_tokens): Embedding(32128, 1024)\n", " (block): ModuleList(\n", " (0): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, 
bias=False)\n", " (relative_attention_bias): Embedding(32, 16)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (1): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (2): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): 
Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (3): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (4): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (5): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " 
(SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (6): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (7): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (8): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (9): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, 
inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (10): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (11): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (12): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", 
" (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (13): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (14): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): 
T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (15): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (16): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, 
inplace=False)\n", " )\n", " )\n", " )\n", " (17): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (18): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (19): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (20): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (21): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): 
Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (22): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (23): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " 
(final_layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (decoder): T5Stack(\n", " (embed_tokens): Embedding(32128, 1024)\n", " (block): ModuleList(\n", " (0): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " (relative_attention_bias): Embedding(32, 16)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (1): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " 
)\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (2): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): 
T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (3): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (4): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (5): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (6): T5Block(\n", " (layer): ModuleList(\n", " (0): 
T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (7): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (8): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (9): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (10): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", 
" )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (11): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (12): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, 
out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (13): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): 
Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (14): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (15): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): 
T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (16): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " 
(dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (17): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (18): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (19): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (20): T5Block(\n", " (layer): ModuleList(\n", " (0): 
T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (21): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (22): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (23): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " (final_layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (lm_head): Linear(in_features=1024, out_features=32128, bias=False)\n", ")\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "DS2twf1P1UYI", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "29de71ca-6f91-4d94-90f6-215b47468d66" }, "source": [ "if display_architecture==True:\n", " print(model.encoder)" ], "execution_count": 19, "outputs": [ { "output_type": "stream", "text": [ "T5Stack(\n", " (embed_tokens): Embedding(32128, 1024)\n", " (block): ModuleList(\n", " (0): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, 
bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " (relative_attention_bias): Embedding(32, 16)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (1): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (2): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): 
Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (3): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (4): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (5): T5Block(\n", " 
(layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (6): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (7): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (8): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (9): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): 
Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (10): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (11): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (12): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " 
(SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (13): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (14): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (15): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (16): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, 
inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (17): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (18): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (19): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", 
" (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (20): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (21): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): 
T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (22): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (23): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, 
inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " (final_layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", ")\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "MCwdhX9U1MA5", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "f2d4dfd6-c313-415b-e997-da0b0bedd7df" }, "source": [ "if display_architecture==True:\n", " print(model.decoder)" ], "execution_count": 20, "outputs": [ { "output_type": "stream", "text": [ "T5Stack(\n", " (embed_tokens): Embedding(32128, 1024)\n", " (block): ModuleList(\n", " (0): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " (relative_attention_bias): Embedding(32, 16)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (1): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): 
T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (2): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", 
" )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (3): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (4): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (5): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): 
T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (6): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (7): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (8): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, 
bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (9): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (10): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): 
T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (11): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " 
)\n", " )\n", " )\n", " (12): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (13): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (14): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (15): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (16): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (17): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (18): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): 
Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (19): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): 
T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (20): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (21): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " 
(layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (22): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, 
bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " (23): T5Block(\n", " (layer): ModuleList(\n", " (0): T5LayerSelfAttention(\n", " (SelfAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (1): T5LayerCrossAttention(\n", " (EncDecAttention): T5Attention(\n", " (q): Linear(in_features=1024, out_features=1024, bias=False)\n", " (k): Linear(in_features=1024, out_features=1024, bias=False)\n", " (v): Linear(in_features=1024, out_features=1024, bias=False)\n", " (o): Linear(in_features=1024, out_features=1024, bias=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (2): T5LayerFF(\n", " (DenseReluDense): T5DenseReluDense(\n", " (wi): Linear(in_features=1024, out_features=4096, bias=False)\n", " (wo): Linear(in_features=4096, out_features=1024, bias=False)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " )\n", " )\n", " )\n", " (final_layer_norm): T5LayerNorm()\n", " (dropout): Dropout(p=0.1, inplace=False)\n", ")\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "GmrCDtcL1hPn", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "3a8d557d-0e1f-4174-f58e-4e0e708434fe" }, "source": [ "if display_architecture==True:\n", " print(model.forward)" ], "execution_count": 21, "outputs": [ { "output_type": "stream", "text": [ "\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "S5KfhCrifP01" }, "source": 
[ "\n", "def summarize(text,ml):\n", " preprocess_text = text.strip().replace(\"\\n\",\"\")\n", " t5_prepared_Text = \"summarize: \"+preprocess_text\n", " print (\"Preprocessed and prepared text: \\n\", t5_prepared_Text)\n", "\n", " tokenized_text = tokenizer.encode(t5_prepared_Text, return_tensors=\"pt\").to(device)\n", "\n", " # summmarize \n", " summary_ids = model.generate(tokenized_text,\n", " num_beams=4,\n", " no_repeat_ngram_size=2,\n", " min_length=30,\n", " max_length=ml,\n", " early_stopping=True)\n", "\n", " output = tokenizer.decode(summary_ids[0], skip_special_tokens=True)\n", " return output" ], "execution_count": 22, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "vqiTNoDc7pOv", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "c83c8762-48bd-47fe-ee95-0da2acf99ddc" }, "source": [ "text=\"\"\"\n", "The United States Declaration of Independence was the first Etext\n", "released by Project Gutenberg, early in 1971. The title was stored\n", "in an emailed instruction set which required a tape or diskpack be\n", "hand mounted for retrieval. The diskpack was the size of a large\n", "cake in a cake carrier, cost $1500, and contained 5 megabytes, of\n", "which this file took 1-2%. Two tape backups were kept plus one on\n", "paper tape. The 10,000 files we hope to have online by the end of\n", "2001 should take about 1-2% of a comparably priced drive in 2001.\n", "\"\"\"\n", "print(\"Number of characters:\",len(text))\n", "summary=summarize(text,50)\n", "print (\"\\n\\nSummarized text: \\n\",summary)\n" ], "execution_count": 23, "outputs": [ { "output_type": "stream", "text": [ "Number of characters: 534\n", "Preprocessed and prepared text: \n", " summarize: The United States Declaration of Independence was the first Etextreleased by Project Gutenberg, early in 1971. The title was storedin an emailed instruction set which required a tape or diskpack behand mounted for retrieval. 
The diskpack was the size of a largecake in a cake carrier, cost $1500, and contained 5 megabytes, ofwhich this file took 1-2%. Two tape backups were kept plus one onpaper tape. The 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in 2001.\n", "\n", "\n", "Summarized text: \n", " the united states declaration of independence was the first etext published by project gutenberg, early in 1971. the 10,000 files we hope to have online by the end of2001 should take about 1-2% of a comparably priced drive in\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "nvB2NenCBfO4" }, "source": [ "Summarizing the Bill of Rights, Version 1" ] }, { "cell_type": "code", "metadata": { "id": "2321zS1Q3jPX", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "6c78b355-8d54-4553-b6e7-d43f8d2b770c" }, "source": [ "#Bill of Rights,V\n", "text =\"\"\"\n", "No person shall be held to answer for a capital, or otherwise infamous crime,\n", "unless on a presentment or indictment of a Grand Jury, except in cases arising\n", "in the land or naval forces, or in the Militia, when in actual service\n", "in time of War or public danger; nor shall any person be subject for\n", "the same offense to be twice put in jeopardy of life or limb;\n", "nor shall be compelled in any criminal case to be a witness against himself,\n", "nor be deprived of life, liberty, or property, without due process of law;\n", "nor shall private property be taken for public use without just compensation.\n", "\n", "\"\"\"\n", "print(\"Number of characters:\",len(text))\n", "summary=summarize(text,50)\n", "print (\"\\n\\nSummarized text: \\n\",summary)\n", " " ], "execution_count": 24, "outputs": [ { "output_type": "stream", "text": [ "Number of characters: 591\n", "Preprocessed and prepared text: \n", " summarize: No person shall be held to answer for a capital, or otherwise infamous crime,unless on a presentment or indictment of a 
Grand Jury, except in cases arisingin the land or naval forces, or in the Militia, when in actual servicein time of War or public danger; nor shall any person be subject forthe same offense to be twice put in jeopardy of life or limb;nor shall be compelled in any criminal case to be a witness against himself,nor be deprived of life, liberty, or property, without due process of law;nor shall private property be taken for public use without just compensation.\n", "\n", "\n", "Summarized text: \n", " no person shall be held to answer for a capital, or otherwise infamous crime, unless ona presentment or indictment ofa Grand Jury. nor shall any person be subject for the same offense to be twice put\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "zr2A49TDBkZz" }, "source": [ "Summarizing the Bill of Rights, Version 2" ] }, { "cell_type": "code", "metadata": { "id": "HWMvLGyahPFP", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "0354d7f5-ec21-476d-eb93-dd66204c30b3" }, "source": [ "#Bill of Rights,V\n", "text =\"\"\"\n", "A person must be indicted by a Grand Jury for a capital or infamous crime.\n", "There are exceptions in time of war for a person in the army, navy, or national guard.\n", "A person can not be judged twice for the same offense or put in a situation of double jeopardy of life.\n", "A person can not be asked to be a witness against herself or himself.\n", "A person cannot be deprived of life, liberty or property without due process of law.\n", "A person must be compensated for property taken for public use.\n", "\"\"\"\n", "print(\"Number of characters:\",len(text))\n", "summary=summarize(text,50)\n", "print (\"\\n\\nSummarized text: \\n\",summary)\n", " " ], "execution_count": 25, "outputs": [ { "output_type": "stream", "text": [ "Number of characters: 486\n", "Preprocessed and prepared text: \n", " summarize: A person must be indicted by a Grand Jury for a capital or infamous crime.There are exceptions in time 
of war for a person in the army, navy, or national guard.A person can not be judged twice for the same offense or put in a situation of double jeopardy of life.A person can not be asked to be a witness against herself or himself.A person cannot be deprived of life, liberty or property without due process of law.A person must be compensated for property taken for public use.\n", "\n", "\n", "Summarized text: \n", " a person cannot be deprived of life, liberty or property without due process of law.A person must be compensated for property taken for public use.\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "k_h8oQ55_zr5", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "4bc5bef5-139a-436f-c820-5a5851cfde36" }, "source": [ "#Montana Corporate Law\n", "#https://corporations.uslegal.com/state-corporation-law/montana-corporation-law/#:~:text=Montana%20Corporation%20Law,carrying%20out%20its%20business%20activities.\n", "\n", "text =\"\"\"The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose. In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities. The corporation can sue and be sued in its corporate name. It has perpetual succession. The corporation can buy, sell or otherwise acquire an interest in a real or personal property. It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country. It can appoint officers and agents of the corporation for various duties and fix their compensation.\n", "The name of a corporation must contain the word “corporation” or its abbreviation “corp.” The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state. 
It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.\n", "The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing. The qualifications for directors are fixed either by articles of incorporation or bylaws. The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation. The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators. The shareholders have the power to change the size of board of directors.\n", "\"\"\"\n", "print(\"Number of characters:\",len(text))\n", "summary=summarize(text,50)\n", "print (\"\\n\\nSummarized text: \\n\",summary)\n", " " ], "execution_count": 26, "outputs": [ { "output_type": "stream", "text": [ "Number of characters: 1816\n", "Preprocessed and prepared text: \n", " summarize: The law regarding corporations prescribes that a corporation can be incorporated in the state of Montana to serve any lawful purpose. In the state of Montana, a corporation has all the powers of a natural person for carrying out its business activities. The corporation can sue and be sued in its corporate name. It has perpetual succession. The corporation can buy, sell or otherwise acquire an interest in a real or personal property. It can conduct business, carry on operations, and have offices and exercise the powers in a state, territory or district in possession of the U.S., or in a foreign country. 
It can appoint officers and agents of the corporation for various duties and fix their compensation.The name of a corporation must contain the word “corporation” or its abbreviation “corp.” The name of a corporation should not be deceptively similar to the name of another corporation incorporated in the same state. It should not be deceptively identical to the fictitious name adopted by a foreign corporation having business transactions in the state.The corporation is formed by one or more natural persons by executing and filing articles of incorporation to the secretary of state of filing. The qualifications for directors are fixed either by articles of incorporation or bylaws. The names and addresses of the initial directors and purpose of incorporation should be set forth in the articles of incorporation. The articles of incorporation should contain the corporate name, the number of shares authorized to issue, a brief statement of the character of business carried out by the corporation, the names and addresses of the directors until successors are elected, and name and addresses of incorporators. The shareholders have the power to change the size of board of directors.\n", "\n", "\n", "Summarized text: \n", " a corporation can be incorporated in the state of Montana to serve any lawful purpose. the corporation has perpetual succession and can sue and be sued in its corporate name. 
it can conduct business, carry on operations, and have offices\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter08/Tokenizer.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Tokenizer.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "name": "python3", "display_name": "Python 3" } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "7fjcTlyE3WvR" }, "source": [ "#Tokenizers\n", "Copyright 2020 Denis Rothman, MIT License\n", "\n", "Reference 1 for word embedding:\n", "https://www.geeksforgeeks.org/python-word-embedding-using-word2vec/\n", "\n", "Reference 2 for cosine similarity:\n", "scikit-learn cosine similarity documentation\n", "\n", "***Upload text.txt before running the Notebook***" ] }, { "cell_type": "code", "metadata": { "id": "JKJ8Saf6vR9b", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "e329b785-128d-447a-b97d-eeaeb740e9e4" }, "source": [ "#@title Pre-Requisites\n", "!pip install gensim==3.8.3\n", "import nltk\n", "nltk.download('punkt')" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Collecting gensim==3.8.3\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/5c/4e/afe2315e08a38967f8a3036bbe7e38b428e9b7a90e823a83d0d49df1adf5/gensim-3.8.3-cp37-cp37m-manylinux1_x86_64.whl (24.2MB)\n", "\u001b[K |████████████████████████████████| 24.2MB 1.5MB/s \n", "\u001b[?25hRequirement already satisfied: scipy>=0.18.1 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.4.1)\n", "Requirement already satisfied: numpy>=1.11.3 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (1.19.5)\n", "Requirement already satisfied: smart-open>=1.8.1 in /usr/local/lib/python3.7/dist-packages (from gensim==3.8.3) (5.0.0)\n", "Requirement already satisfied: six>=1.5.0 in /usr/local/lib/python3.7/dist-packages (from 
gensim==3.8.3) (1.15.0)\n", "Installing collected packages: gensim\n", " Found existing installation: gensim 4.0.1\n", " Uninstalling gensim-4.0.1:\n", " Successfully uninstalled gensim-4.0.1\n", "Successfully installed gensim-3.8.3\n", "[nltk_data] Downloading package punkt to /root/nltk_data...\n", "[nltk_data] Package punkt is already up-to-date!\n" ], "name": "stdout" }, { "output_type": "execute_result", "data": { "text/plain": [ "True" ] }, "metadata": { "tags": [] }, "execution_count": 1 } ] }, { "cell_type": "code", "metadata": { "id": "7o7EeDUUu0Sh" }, "source": [ "import math\n", "import numpy as np\n", "from nltk.tokenize import sent_tokenize, word_tokenize \n", "import gensim \n", "from gensim.models import Word2Vec \n", "import numpy as np\n", "from sklearn.metrics.pairwise import cosine_similarity\n", "import matplotlib.pyplot as plt\n", "import warnings \n", "warnings.filterwarnings(action = 'ignore') " ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "1NRomrXEJOxJ", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "360af319-8259-469e-babd-6eaa4cd6c714" }, "source": [ "#@title Word2Vec Tokenization\n", "#‘text.txt’ file \n", "sample = open(\"text.txt\", \"r\") \n", "s = sample.read() \n", "\n", "# processing escape characters \n", "f = s.replace(\"\\n\", \" \") \n", "\n", "data = [] \n", "# sentence parsing\n", "for i in sent_tokenize(f): \n", "\ttemp = [] \n", "\t# tokenize the sentence into words \n", "\tfor j in word_tokenize(i): \n", "\t\ttemp.append(j.lower())\n", "\tdata.append(temp)\n", "\n", "# Creating Skip Gram model \n", "model2 = gensim.models.Word2Vec(data, min_count = 1, size = 512,window = 5, sg = 1) \n", "print(model2)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Word2Vec(vocab=11822, size=512, alpha=0.025)\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "YcC_3JLcJTgw" }, "source": [ "#@title Cosine Similarity\n", "def 
similarity(word1,word2):\n", " cosine=False #default value\n", " try:\n", " a=model2[word1]\n", " cosine=True\n", " except KeyError: #The KeyError exception is raised\n", " print(word1, \":[unk] key not found in dictionary\")#False implied\n", "\n", " try:\n", " b=model2[word2]#a=True implied\n", " except KeyError: #The KeyError exception is raised\n", " cosine=False #both a and b must be true\n", " print(word2, \":[unk] key not found in dictionary\")\n", "\n", " if(cosine==True):\n", " b=model2[word2]\n", " # compute cosine similarity\n", " dot = np.dot(a, b)\n", " norma = np.linalg.norm(a)\n", " normb = np.linalg.norm(b)\n", " cos = dot / (norma * normb)\n", "\n", " aa = a.reshape(1,512) \n", " ba = b.reshape(1,512)\n", " #print(\"Word1\",aa)\n", " #print(\"Word2\",ba)\n", " cos_lib = cosine_similarity(aa, ba)\n", " #print(cos_lib,\"word similarity\")\n", " \n", " if(cosine==False):cos_lib=0;\n", " return cos_lib" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "fMfgbogHJVh-", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "3bdcc464-75fb-4f60-be5b-86b59af7809e" }, "source": [ "#@title Case 0: Words in text and dictionary\n", "word1=\"freedom\";word2=\"liberty\"\n", "print(\"Similarity\",similarity(word1,word2),word1,word2)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Similarity [[0.38632965]] freedom liberty\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "4B7vvKxOLbYC", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "837c3d9f-f64c-43be-f689-0c8ea9284c25" }, "source": [ "#@title Word(s) Case 1: Word not in text or dictionary\n", "word1=\"corporations\";word2=\"rights\"\n", "print(\"Similarity\",similarity(word1,word2),word1,word2)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "corporations :[unk] key not found in dictionary\n", "Similarity 0 corporations rights\n" ], "name": "stdout" } ] }, { 
"cell_type": "code", "metadata": { "id": "qkFIC79JCQJp", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "2f53a434-ce87-47c2-c37c-375f9afac846" }, "source": [ "#@title Case 2: Noisy Relationship \n", "word1=\"etext\";word2=\"declaration\"\n", "print(\"Similarity\",similarity(word1,word2),word1,word2)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Similarity [[0.51544815]] etext declaration\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "mKVPiEi-GZtf", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "f2b3a8b0-63f1-4e0c-a242-ca709140bcb3" }, "source": [ "#@title Case 3: Rare words\n", "word1=\"justiciar\";word2=\"judgement\"\n", "print(\"Similarity\",similarity(word1,word2),word1,word2)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Similarity [[0.2304948]] justiciar judgement\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "0xZtAm3DHGJg", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "7ac7a0f5-3509-4254-a7f2-c55b6fb8b46c" }, "source": [ "#@title Case 4: Replacing words\n", "word1=\"judge\";word2=\"judgement\"\n", "print(\"Similarity\",similarity(word1,word2),word1,word2)\n", "\n", "word1=\"justiciar\";word2=\"judge\"\n", "print(\"Similarity\",similarity(word1,word2),word1,word2)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Similarity [[0.20353234]] judge judgement\n", "Similarity [[0.37659135]] justiciar judge\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "wOSID8kXHXWt", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "1a60f440-4d9c-4b68-e30f-60fe474cf458" }, "source": [ "#@title Case 5: Entailment\n", "word1=\"pay\";word2=\"debt\"\n", "print(\"Similarity\",similarity(word1,word2),word1,word2)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Similarity [[0.54338676]] pay debt\n" ], "name": 
"stdout" } ] } ] } ================================================ FILE: Chapter08/Training_OpenAI_GPT_2_CH08.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "Training OpenAI GPT-2-CH08.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "name": "python3", "display_name": "Python 3" }, "accelerator": "GPU" }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "LH2YgC7LfzJZ", "colab_type": "text" }, "source": [ "#Training OpenAI GTP-2\n", "Copyright 2020, Denis Rothman MIT License. Denis Rothman created the Colab notebook using the OpenAI repository, adding title steps for educational purposes only.\n", "\n", "***Code References***\n", "\n", "[Reference: OpenAI Repository](https://github.com/openai/gpt-2)\n", "The repository was cloned and adapted to N Shepperd's repository.\n", "\n", "[Reference: N Shepperd Repository](https://github.com/nshepperd/gpt-2)\n", "The repository was not cloned. N Shepperd's training programs were inserted into the OpenAI Repository. The list of N Shepperd's programs are cited in the 'N Shepperd' section of the notebook. Some programs were modified for educational purposes only to work with this notebook.\n", "\n", "***Model Reference Paper***\n", "\n", "[Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei, Ilya Sutskever,2019,'Language Models are Unsupervised Multitask Learners'](https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf)\n", "\n", "\n", "***Step 1: Pre-requisites:***\n", "\n", "a) activate GPU in the notebook settings runTime menu
\n", "b) Upload the following program files and dset.txt(dataset) with the file manager: train.py,load_dataset.py,encode.py,accumulate,memory_saving_gradients.py,dset.txt" ] }, { "cell_type": "code", "metadata": { "id": "isqdu1fpfmqM", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 102 }, "outputId": "0662d019-7248-4642-c840-7b87c08e7ce7" }, "source": [ "#@title Step 2: Cloning the OpenAI GPT-2 Repository \n", "#!git clone https://github.com/nshepperd/gpt-2.git\n", "!git clone https://github.com/openai/gpt-2.git" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Cloning into 'gpt-2'...\n", "remote: Enumerating objects: 230, done.\u001b[K\n", "remote: Total 230 (delta 0), reused 0 (delta 0), pack-reused 230\u001b[K\n", "Receiving objects: 100% (230/230), 4.38 MiB | 7.37 MiB/s, done.\n", "Resolving deltas: 100% (119/119), done.\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "7RHOjN-TjUbj", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 887 }, "outputId": "cc45d116-e7a5-4ff8-e41b-7d440317c9a8" }, "source": [ "#@title Step 3: Installing the requirements\n", "import os # when the VM restarts import os necessary\n", "os.chdir(\"/content/gpt-2\") \n", "!pip3 install -r requirements.txt" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Collecting fire>=0.1.3\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/34/a7/0e22e70778aca01a52b9c899d9c145c6396d7b613719cd63db97ffa13f2f/fire-0.3.1.tar.gz (81kB)\n", "\u001b[K |████████████████████████████████| 81kB 2.5MB/s \n", "\u001b[?25hCollecting regex==2017.4.5\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/36/62/c0c0d762ffd4ffaf39f372eb8561b8d491a11ace5a7884610424a8b40f95/regex-2017.04.05.tar.gz (601kB)\n", "\u001b[K |████████████████████████████████| 604kB 8.9MB/s \n", "\u001b[?25hCollecting requests==2.21.0\n", "\u001b[?25l Downloading 
https://files.pythonhosted.org/packages/7d/e3/20f3d364d6c8e5d2353c72a67778eb189176f08e873c9900e10c0287b84b/requests-2.21.0-py2.py3-none-any.whl (57kB)\n", "\u001b[K |████████████████████████████████| 61kB 6.5MB/s \n", "\u001b[?25hCollecting tqdm==4.31.1\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/6c/4b/c38b5144cf167c4f52288517436ccafefe9dc01b8d1c190e18a6b154cd4a/tqdm-4.31.1-py2.py3-none-any.whl (48kB)\n", "\u001b[K |████████████████████████████████| 51kB 5.7MB/s \n", "\u001b[?25hRequirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.12.0)\n", "Requirement already satisfied: termcolor in /usr/local/lib/python3.6/dist-packages (from fire>=0.1.3->-r requirements.txt (line 1)) (1.1.0)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (2020.6.20)\n", "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (3.0.4)\n", "Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests==2.21.0->-r requirements.txt (line 3)) (1.24.3)\n", "Collecting idna<2.9,>=2.5\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl (58kB)\n", "\u001b[K |████████████████████████████████| 61kB 6.0MB/s \n", "\u001b[?25hBuilding wheels for collected packages: fire, regex\n", " Building wheel for fire (setup.py) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for fire: filename=fire-0.3.1-py2.py3-none-any.whl size=111005 sha256=3310fe2adb427d9c42d252d7a50303321e9db5a10c95bd0083efc4df204f9703\n", " Stored in directory: /root/.cache/pip/wheels/c1/61/df/768b03527bf006b546dce284eb4249b185669e65afc5fbb2ac\n", " Building wheel for regex (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n", " Created wheel for regex: filename=regex-2017.4.5-cp36-cp36m-linux_x86_64.whl size=533204 sha256=410a1a2649a21cad83bbd2d67acd95e54704541f49ca03c2ac08574a44ff5985\n", " Stored in directory: /root/.cache/pip/wheels/75/07/38/3c16b529d50cb4e0cd3dbc7b75cece8a09c132692c74450b01\n", "Successfully built fire regex\n", "\u001b[31mERROR: spacy 2.2.4 has requirement tqdm<5.0.0,>=4.38.0, but you'll have tqdm 4.31.1 which is incompatible.\u001b[0m\n", "\u001b[31mERROR: google-colab 1.0.0 has requirement requests~=2.23.0, but you'll have requests 2.21.0 which is incompatible.\u001b[0m\n", "\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\u001b[0m\n", "Installing collected packages: fire, regex, idna, requests, tqdm\n", " Found existing installation: regex 2019.12.20\n", " Uninstalling regex-2019.12.20:\n", " Successfully uninstalled regex-2019.12.20\n", " Found existing installation: idna 2.9\n", " Uninstalling idna-2.9:\n", " Successfully uninstalled idna-2.9\n", " Found existing installation: requests 2.23.0\n", " Uninstalling requests-2.23.0:\n", " Successfully uninstalled requests-2.23.0\n", " Found existing installation: tqdm 4.41.1\n", " Uninstalling tqdm-4.41.1:\n", " Successfully uninstalled tqdm-4.41.1\n", "Successfully installed fire-0.3.1 idna-2.8 regex-2017.4.5 requests-2.21.0 tqdm-4.31.1\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.colab-display-data+json": { "pip_warning": { "packages": [ "idna", "requests", "tqdm" ] } } }, "metadata": { "tags": [] } } ] }, { "cell_type": "code", "metadata": { "id": "q9vV73Opw68m", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 105 }, "outputId": "8d3e336b-7385-4a51-f054-bf3a1ffd3b6a" }, "source": [ "!pip install toposort" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Collecting toposort\n", " Downloading 
https://files.pythonhosted.org/packages/e9/8a/321cd8ea5f4a22a06e3ba30ef31ec33bea11a3443eeb1d89807640ee6ed4/toposort-1.5-py2.py3-none-any.whl\n", "Installing collected packages: toposort\n", "Successfully installed toposort-1.5\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "_kpNCnh9fyYD", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 51 }, "outputId": "6915ef8b-a48f-4a27-c6d4-43fda10b0e82" }, "source": [ "#@title Step 4: Checking TensorFlow version \n", "#Colab has tf 1.x and tf 2.x installed\n", "#Restart runtime using 'Runtime' -> 'Restart runtime...'\n", "%tensorflow_version 1.x\n", "import tensorflow as tf\n", "print(tf.__version__)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "TensorFlow 1.x selected.\n", "1.15.2\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "jvVj0cLVkaPL", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 136 }, "outputId": "12f91649-5661-4323-887a-bed1456ce370" }, "source": [ "#@title Step 5: Downloading 117M parameter GPT-2 Model\n", "# run code and send argument\n", "import os # after runtime is restarted\n", "os.chdir(\"/content/gpt-2\")\n", "!python3 download_model.py '117M' #creates model directory" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "\rFetching checkpoint: 0%| | 0.00/77.0 [00:00 physical GPU (device: 0, name: Tesla K80, pci bus id: 0000:00:04.0, compute capability: 3.7)\n", "WARNING:tensorflow:From train.py:93: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/model.py:148: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/model.py:152: The name tf.get_variable is deprecated. 
Please use tf.compat.v1.get_variable instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/model.py:36: The name tf.rsqrt is deprecated. Please use tf.math.rsqrt instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:51: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:64: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Use `tf.cast` instead.\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:16: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Use tf.where in 2.0, which has the same broadcast rule as np.where\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:67: multinomial (from tensorflow.python.ops.random_ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Use `tf.random.categorical` instead.\n", "WARNING:tensorflow:From train.py:118: The name tf.trainable_variables is deprecated. Please use tf.compat.v1.trainable_variables instead.\n", "\n", "WARNING:tensorflow:From train.py:122: The name tf.train.AdamOptimizer is deprecated. Please use tf.compat.v1.train.AdamOptimizer instead.\n", "\n", "WARNING:tensorflow:From train.py:145: The name tf.summary.scalar is deprecated. Please use tf.compat.v1.summary.scalar instead.\n", "\n", "WARNING:tensorflow:From train.py:148: The name tf.summary.merge is deprecated. Please use tf.compat.v1.summary.merge instead.\n", "\n", "WARNING:tensorflow:From train.py:150: The name tf.summary.FileWriter is deprecated. Please use tf.compat.v1.summary.FileWriter instead.\n", "\n", "WARNING:tensorflow:From train.py:153: The name tf.train.Saver is deprecated. 
Please use tf.compat.v1.train.Saver instead.\n", "\n", "WARNING:tensorflow:From train.py:157: The name tf.global_variables_initializer is deprecated. Please use tf.compat.v1.global_variables_initializer instead.\n", "\n", "Loading checkpoint models/117M/model.ckpt\n", "Loading dataset...\n", "100% 1/1 [00:00<00:00, 260.74it/s]\n", "dataset has 29379 tokens\n", "Training...\n", "2020-06-29 09:17:29.007668: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\n", "[1 | 7.03] loss=3.18 avg=3.18\n", "[2 | 7.96] loss=2.67 avg=2.92\n", "[3 | 8.90] loss=2.92 avg=2.92\n", "[4 | 9.82] loss=3.00 avg=2.94\n", "[5 | 10.76] loss=2.65 avg=2.88\n", "[6 | 11.69] loss=2.88 avg=2.88\n", "[7 | 12.63] loss=2.80 avg=2.87\n", "[8 | 13.57] loss=2.68 avg=2.84\n", "[9 | 14.52] loss=2.88 avg=2.85\n", "[10 | 15.46] loss=3.93 avg=2.96\n", "[11 | 16.40] loss=3.06 avg=2.97\n", "[12 | 17.34] loss=2.48 avg=2.93\n", "[13 | 18.28] loss=2.69 avg=2.91\n", "[14 | 19.22] loss=3.19 avg=2.93\n", "[15 | 20.16] loss=2.29 avg=2.88\n", "[16 | 21.11] loss=2.28 avg=2.84\n", "[17 | 22.04] loss=2.91 avg=2.85\n", "[18 | 22.97] loss=2.67 avg=2.84\n", "[19 | 23.91] loss=2.14 avg=2.80\n", "[20 | 24.85] loss=2.00 avg=2.75\n", "[21 | 25.78] loss=2.58 avg=2.75\n", "[22 | 26.73] loss=2.66 avg=2.74\n", "[23 | 27.67] loss=2.80 avg=2.74\n", "[24 | 28.60] loss=3.18 avg=2.76\n", "[25 | 29.54] loss=2.95 avg=2.77\n", "[26 | 30.47] loss=3.41 avg=2.80\n", "[27 | 31.41] loss=2.92 avg=2.81\n", "[28 | 32.34] loss=2.33 avg=2.79\n", "[29 | 33.27] loss=2.17 avg=2.76\n", "[30 | 34.20] loss=1.87 avg=2.73\n", "[31 | 35.13] loss=2.60 avg=2.72\n", "[32 | 36.06] loss=2.71 avg=2.72\n", "[33 | 37.00] loss=2.82 avg=2.73\n", "[34 | 37.95] loss=2.26 avg=2.71\n", "[35 | 38.89] loss=2.20 avg=2.69\n", "[36 | 39.83] loss=2.48 avg=2.69\n", "[37 | 40.76] loss=2.03 avg=2.66\n", "[38 | 41.70] loss=2.15 avg=2.65\n", "[39 | 42.64] loss=2.57 avg=2.65\n", "[40 | 43.57] loss=2.42 avg=2.64\n", 
"[41 | 44.50] loss=2.20 avg=2.63\n", "[42 | 45.43] loss=3.01 avg=2.64\n", "[43 | 46.37] loss=2.74 avg=2.64\n", "[44 | 47.30] loss=3.33 avg=2.66\n", "[45 | 48.24] loss=3.14 avg=2.67\n", "[46 | 49.17] loss=2.40 avg=2.67\n", "[47 | 50.11] loss=2.58 avg=2.66\n", "[48 | 51.04] loss=1.93 avg=2.64\n", "[49 | 51.97] loss=3.22 avg=2.66\n", "[50 | 52.91] loss=2.56 avg=2.66\n", "[51 | 53.84] loss=1.95 avg=2.64\n", "[52 | 54.77] loss=2.18 avg=2.63\n", "[53 | 55.70] loss=2.65 avg=2.63\n", "[54 | 56.63] loss=2.29 avg=2.62\n", "[55 | 57.55] loss=2.21 avg=2.61\n", "[56 | 58.49] loss=1.98 avg=2.60\n", "[57 | 59.41] loss=2.47 avg=2.59\n", "[58 | 60.34] loss=1.95 avg=2.58\n", "[59 | 61.26] loss=2.40 avg=2.57\n", "[60 | 62.19] loss=2.22 avg=2.57\n", "[61 | 63.12] loss=3.16 avg=2.58\n", "[62 | 64.05] loss=2.25 avg=2.57\n", "[63 | 64.99] loss=3.32 avg=2.59\n", "[64 | 65.93] loss=2.44 avg=2.59\n", "[65 | 66.86] loss=2.39 avg=2.58\n", "[66 | 67.79] loss=2.23 avg=2.57\n", "[67 | 68.73] loss=2.21 avg=2.57\n", "[68 | 69.66] loss=2.45 avg=2.56\n", "[69 | 70.58] loss=3.28 avg=2.58\n", "[70 | 71.52] loss=2.22 avg=2.57\n", "[71 | 72.45] loss=1.76 avg=2.56\n", "[72 | 73.38] loss=3.01 avg=2.56\n", "[73 | 74.31] loss=2.04 avg=2.55\n", "[74 | 75.25] loss=2.20 avg=2.55\n", "[75 | 76.18] loss=2.43 avg=2.54\n", "[76 | 77.10] loss=3.45 avg=2.56\n", "[77 | 78.03] loss=2.40 avg=2.56\n", "[78 | 78.96] loss=2.34 avg=2.55\n", "[79 | 79.89] loss=2.09 avg=2.55\n", "[80 | 80.82] loss=2.17 avg=2.54\n", "[81 | 81.75] loss=2.27 avg=2.53\n", "[82 | 82.69] loss=2.17 avg=2.53\n", "[83 | 83.62] loss=2.19 avg=2.52\n", "[84 | 84.56] loss=2.73 avg=2.53\n", "[85 | 85.49] loss=2.96 avg=2.53\n", "[86 | 86.43] loss=2.20 avg=2.53\n", "[87 | 87.37] loss=2.10 avg=2.52\n", "[88 | 88.31] loss=2.91 avg=2.53\n", "[89 | 89.24] loss=2.91 avg=2.53\n", "[90 | 90.17] loss=2.07 avg=2.53\n", "[91 | 91.10] loss=2.84 avg=2.53\n", "[92 | 92.03] loss=1.77 avg=2.52\n", "[93 | 92.96] loss=2.68 avg=2.52\n", "[94 | 93.88] loss=2.36 avg=2.52\n", 
"[95 | 94.81] loss=2.65 avg=2.52\n", "[96 | 95.74] loss=1.89 avg=2.51\n", "[97 | 96.68] loss=2.37 avg=2.51\n", "[98 | 97.60] loss=1.99 avg=2.50\n", "[99 | 98.53] loss=2.62 avg=2.50\n", "Generating samples...\n", "======== SAMPLE 1 ========\n", "ive, and the two are not related.\n", "\n", "The second is the same as the first, but the two are not related.\n", "\n", "The third is the same as the first, but the two are not related.\n", "\n", "The fourth is the same as the first, but the two are not related.\n", "\n", "The fifth is the same as the first, but the two are not related.\n", "\n", "The sixth is the same as the first, but the two are not related.\n", "\n", "The seventh is the same as the first, but the two are not related.\n", "\n", "The eighth is the same as the first, but the two are not related.\n", "\n", "The ninth is the same as the first, but the two are not related.\n", "\n", "The tenth is the same as the first, but the two are not related.\n", "\n", "The eleventh is the same as the first, but the two are not related.\n", "\n", "The twelfth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two 
are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same 
as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the two are not related.\n", "\n", "The thirteenth is the same as the first, but the\n", "\n", "[100 | 121.68] loss=1.76 avg=2.49\n", "[101 | 122.61] loss=2.08 avg=2.48\n", "[102 | 123.55] loss=2.05 avg=2.48\n", "[103 | 124.49] loss=2.38 avg=2.48\n", "[104 | 125.43] loss=2.39 avg=2.47\n", "[105 | 126.36] loss=2.23 avg=2.47\n", "[106 | 127.31] loss=2.02 avg=2.46\n", "[107 | 128.24] loss=2.95 avg=2.47\n", "[108 | 129.17] loss=1.90 avg=2.46\n", "[109 | 130.11] loss=2.49 avg=2.46\n", "[110 | 131.04] loss=2.15 avg=2.46\n", "[111 | 131.97] loss=2.17 avg=2.45\n", "[112 | 132.90] loss=2.15 avg=2.45\n", "[113 | 133.83] loss=2.10 avg=2.44\n", "[114 | 134.75] loss=2.61 avg=2.45\n", "[115 | 135.68] loss=2.62 avg=2.45\n", "[116 | 136.61] loss=2.28 avg=2.45\n", "[117 | 137.54] loss=2.04 avg=2.44\n", "[118 | 138.47] loss=1.96 avg=2.43\n", "[119 | 139.40] loss=1.84 avg=2.42\n", "[120 | 140.33] loss=2.49 avg=2.43\n", "[121 | 141.26] loss=1.63 avg=2.41\n", "[122 | 142.19] loss=2.49 avg=2.42\n", "[123 | 143.12] loss=2.08 avg=2.41\n", "[124 | 144.05] loss=1.63 avg=2.40\n", "[125 | 144.97] loss=2.10 avg=2.40\n", "[126 | 145.91] loss=3.43 avg=2.41\n", "[127 | 146.84] loss=2.68 avg=2.41\n", "[128 | 147.78] loss=1.55 avg=2.40\n", "[129 | 148.72] loss=2.65 avg=2.41\n", "[130 | 149.66] loss=1.87 avg=2.40\n", "[131 | 150.59] loss=3.37 avg=2.41\n", "[132 | 151.52] loss=1.48 avg=2.40\n", "[133 | 152.44] loss=2.43 avg=2.40\n", "[134 | 153.37] loss=3.28 avg=2.41\n", "[135 | 154.31] loss=1.49 avg=2.40\n", "[136 | 155.24] loss=1.95 avg=2.39\n", "[137 | 156.17] loss=2.05 avg=2.39\n", "[138 | 157.10] loss=2.05 avg=2.38\n", "[139 | 158.04] loss=2.11 avg=2.38\n", 
"[140 | 158.97] loss=1.66 avg=2.37\n", "[141 | 159.90] loss=1.82 avg=2.36\n", "[142 | 160.82] loss=2.41 avg=2.36\n", "[143 | 161.75] loss=1.53 avg=2.35\n", "[144 | 162.68] loss=2.33 avg=2.35\n", "[145 | 163.62] loss=1.95 avg=2.35\n", "[146 | 164.56] loss=1.88 avg=2.34\n", "[147 | 165.50] loss=1.91 avg=2.34\n", "[148 | 166.43] loss=1.93 avg=2.33\n", "[149 | 167.36] loss=1.72 avg=2.32\n", "[150 | 168.31] loss=2.56 avg=2.33\n", "[151 | 169.25] loss=2.28 avg=2.32\n", "[152 | 170.19] loss=1.94 avg=2.32\n", "[153 | 171.12] loss=2.83 avg=2.33\n", "[154 | 172.05] loss=1.50 avg=2.32\n", "[155 | 172.98] loss=1.85 avg=2.31\n", "[156 | 173.92] loss=1.74 avg=2.30\n", "[157 | 174.84] loss=1.63 avg=2.29\n", "[158 | 175.78] loss=1.65 avg=2.29\n", "[159 | 176.71] loss=2.11 avg=2.28\n", "[160 | 177.64] loss=1.82 avg=2.28\n", "[161 | 178.57] loss=1.92 avg=2.27\n", "[162 | 179.49] loss=1.85 avg=2.27\n", "[163 | 180.41] loss=2.33 avg=2.27\n", "[164 | 181.34] loss=1.66 avg=2.26\n", "[165 | 182.27] loss=1.46 avg=2.25\n", "[166 | 183.19] loss=1.62 avg=2.24\n", "[167 | 184.12] loss=1.62 avg=2.24\n", "[168 | 185.04] loss=2.43 avg=2.24\n", "[169 | 185.97] loss=1.23 avg=2.23\n", "[170 | 186.89] loss=1.78 avg=2.22\n", "[171 | 187.83] loss=2.42 avg=2.22\n", "[172 | 188.76] loss=1.61 avg=2.22\n", "[173 | 189.70] loss=1.67 avg=2.21\n", "[174 | 190.63] loss=2.53 avg=2.21\n", "[175 | 191.56] loss=1.82 avg=2.21\n", "[176 | 192.49] loss=1.53 avg=2.20\n", "[177 | 193.43] loss=1.21 avg=2.19\n", "[178 | 194.35] loss=2.13 avg=2.19\n", "[179 | 195.28] loss=2.07 avg=2.19\n", "[180 | 196.21] loss=1.44 avg=2.18\n", "[181 | 197.14] loss=2.44 avg=2.18\n", "[182 | 198.07] loss=2.22 avg=2.18\n", "[183 | 199.00] loss=1.86 avg=2.18\n", "[184 | 199.94] loss=2.09 avg=2.18\n", "[185 | 200.87] loss=2.00 avg=2.17\n", "[186 | 201.81] loss=2.12 avg=2.17\n", "[187 | 202.74] loss=1.32 avg=2.16\n", "[188 | 203.67] loss=2.10 avg=2.16\n", "[189 | 204.59] loss=1.52 avg=2.15\n", "[190 | 205.52] loss=1.69 avg=2.15\n", "[191 | 
206.46] loss=2.13 avg=2.15\n", "[192 | 207.39] loss=2.10 avg=2.15\n", "[193 | 208.32] loss=2.32 avg=2.15\n", "[194 | 209.25] loss=2.89 avg=2.16\n", "[195 | 210.18] loss=1.48 avg=2.15\n", "[196 | 211.12] loss=1.45 avg=2.14\n", "[197 | 212.05] loss=2.73 avg=2.15\n", "[198 | 212.97] loss=1.91 avg=2.15\n", "[199 | 213.90] loss=1.58 avg=2.14\n", "Generating samples...\n", "======== SAMPLE 1 ========\n", "002\n", "\n", "(1)\n", "\n", "where\n", "\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", 
"transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "and\n", "(v) = 0.\n", "\n", "In this case, we can perform the\n", "\n", "transition operator\n", "\n", "where\n", "S(x, v, vˆ) =\n", "\n", "(x)\n", "\n", "[200 | 235.36] loss=2.05 avg=2.14\n", "[201 | 236.29] 
loss=2.65 avg=2.15\n", "[202 | 237.21] loss=1.34 avg=2.14\n", "[203 | 238.14] loss=1.23 avg=2.13\n", "[204 | 239.06] loss=1.69 avg=2.12\n", "[205 | 240.00] loss=1.38 avg=2.11\n", "[206 | 240.94] loss=1.44 avg=2.10\n", "[207 | 241.87] loss=2.10 avg=2.10\n", "[208 | 242.81] loss=1.89 avg=2.10\n", "[209 | 243.74] loss=2.23 avg=2.10\n", "[210 | 244.67] loss=1.67 avg=2.10\n", "[211 | 245.60] loss=1.49 avg=2.09\n", "[212 | 246.53] loss=1.76 avg=2.09\n", "[213 | 247.46] loss=1.46 avg=2.08\n", "[214 | 248.39] loss=1.55 avg=2.07\n", "[215 | 249.32] loss=1.73 avg=2.07\n", "[216 | 250.25] loss=1.28 avg=2.06\n", "[217 | 251.19] loss=2.06 avg=2.06\n", "[218 | 252.11] loss=1.38 avg=2.05\n", "[219 | 253.04] loss=1.70 avg=2.05\n", "[220 | 253.96] loss=1.93 avg=2.05\n", "[221 | 254.90] loss=1.72 avg=2.05\n", "[222 | 255.83] loss=1.43 avg=2.04\n", "[223 | 256.77] loss=1.31 avg=2.03\n", "[224 | 257.70] loss=1.37 avg=2.02\n", "[225 | 258.64] loss=1.23 avg=2.01\n", "[226 | 259.58] loss=1.39 avg=2.01\n", "[227 | 260.51] loss=1.38 avg=2.00\n", "[228 | 261.45] loss=1.91 avg=2.00\n", "[229 | 262.38] loss=1.49 avg=1.99\n", "[230 | 263.31] loss=2.82 avg=2.00\n", "[231 | 264.25] loss=1.32 avg=1.99\n", "[232 | 265.17] loss=1.44 avg=1.99\n", "[233 | 266.10] loss=1.64 avg=1.98\n", "[234 | 267.03] loss=1.49 avg=1.98\n", "[235 | 267.96] loss=1.15 avg=1.97\n", "[236 | 268.90] loss=1.86 avg=1.97\n", "[237 | 269.83] loss=1.50 avg=1.96\n", "[238 | 270.76] loss=1.42 avg=1.96\n", "[239 | 271.70] loss=1.60 avg=1.95\n", "[240 | 272.61] loss=1.37 avg=1.95\n", "[241 | 273.55] loss=1.34 avg=1.94\n", "[242 | 274.48] loss=1.20 avg=1.93\n", "[243 | 275.40] loss=1.24 avg=1.93\n", "[244 | 276.32] loss=1.64 avg=1.92\n", "[245 | 277.25] loss=1.20 avg=1.91\n", "[246 | 278.18] loss=1.91 avg=1.91\n", "[247 | 279.11] loss=1.71 avg=1.91\n", "[248 | 280.04] loss=1.19 avg=1.90\n", "[249 | 280.96] loss=1.61 avg=1.90\n", "[250 | 281.90] loss=1.61 avg=1.90\n", "[251 | 282.83] loss=1.36 avg=1.89\n", "[252 | 283.76] loss=1.63 
avg=1.89\n", "[253 | 284.69] loss=2.02 avg=1.89\n", "[254 | 285.64] loss=1.33 avg=1.88\n", "[255 | 286.58] loss=1.04 avg=1.88\n", "[256 | 287.51] loss=1.20 avg=1.87\n", "[257 | 288.45] loss=1.43 avg=1.86\n", "[258 | 289.38] loss=1.03 avg=1.85\n", "[259 | 290.30] loss=1.04 avg=1.85\n", "[260 | 291.24] loss=1.84 avg=1.85\n", "[261 | 292.17] loss=2.42 avg=1.85\n", "[262 | 293.11] loss=1.92 avg=1.85\n", "[263 | 294.04] loss=1.78 avg=1.85\n", "[264 | 294.97] loss=1.89 avg=1.85\n", "[265 | 295.90] loss=1.04 avg=1.84\n", "[266 | 296.83] loss=1.08 avg=1.84\n", "[267 | 297.76] loss=2.00 avg=1.84\n", "[268 | 298.71] loss=1.56 avg=1.83\n", "[269 | 299.64] loss=1.78 avg=1.83\n", "[270 | 300.58] loss=2.13 avg=1.84\n", "[271 | 301.52] loss=1.21 avg=1.83\n", "[272 | 302.45] loss=1.03 avg=1.82\n", "[273 | 303.39] loss=2.25 avg=1.83\n", "[274 | 304.33] loss=1.13 avg=1.82\n", "[275 | 305.26] loss=1.66 avg=1.82\n", "[276 | 306.18] loss=1.40 avg=1.81\n", "[277 | 307.11] loss=1.11 avg=1.80\n", "[278 | 308.04] loss=1.41 avg=1.80\n", "[279 | 308.98] loss=2.19 avg=1.80\n", "[280 | 309.91] loss=1.21 avg=1.80\n", "[281 | 310.84] loss=0.96 avg=1.79\n", "[282 | 311.77] loss=1.13 avg=1.78\n", "[283 | 312.70] loss=0.89 avg=1.77\n", "[284 | 313.64] loss=1.72 avg=1.77\n", "[285 | 314.57] loss=1.03 avg=1.76\n", "[286 | 315.50] loss=2.07 avg=1.77\n", "[287 | 316.43] loss=0.93 avg=1.76\n", "[288 | 317.36] loss=1.32 avg=1.75\n", "[289 | 318.28] loss=0.93 avg=1.75\n", "[290 | 319.21] loss=1.28 avg=1.74\n", "[291 | 320.14] loss=2.47 avg=1.75\n", "[292 | 321.06] loss=1.72 avg=1.75\n", "[293 | 321.99] loss=0.88 avg=1.74\n", "[294 | 322.92] loss=1.20 avg=1.73\n", "[295 | 323.86] loss=0.93 avg=1.72\n", "[296 | 324.80] loss=2.07 avg=1.73\n", "[297 | 325.74] loss=0.84 avg=1.72\n", "[298 | 326.67] loss=1.90 avg=1.72\n", "[299 | 327.60] loss=1.64 avg=1.72\n", "Generating samples...\n", "======== SAMPLE 1 ========\n", "S.\n", "The first step in the identification of the chemotactic regime is to compare the two 
regimes. In the first case, we compare the two regimes, while in the second case, we compare the chemotactic regime. In the first case, we compare the two regimes, while in the second case, we compare the chemotactic regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. 
In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. 
In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we compare the two regimes, while in the second case, we compare the regime. In the first case, we\n", "\n", "[300 | 348.94] loss=1.11 avg=1.71\n", "[301 | 349.86] loss=1.62 avg=1.71\n", "[302 | 350.79] loss=1.19 avg=1.71\n", "[303 | 351.72] loss=0.70 avg=1.70\n", "[304 | 352.65] loss=1.82 avg=1.70\n", "[305 | 353.58] loss=0.90 avg=1.69\n", "[306 | 354.50] loss=0.91 avg=1.68\n", "[307 | 355.43] loss=1.17 avg=1.68\n", "[308 | 356.35] loss=0.75 avg=1.67\n", "[309 | 357.28] loss=2.11 avg=1.67\n", "[310 | 358.21] loss=0.94 avg=1.66\n", "[311 | 359.13] loss=1.06 avg=1.66\n", "[312 | 360.06] loss=1.33 avg=1.65\n", "[313 | 360.99] loss=1.52 avg=1.65\n", "[314 | 361.93] loss=1.02 avg=1.65\n", "[315 | 362.86] loss=0.63 avg=1.63\n", "[316 | 363.79] loss=1.17 avg=1.63\n", "[317 | 364.74] loss=0.87 avg=1.62\n", "[318 | 365.68] loss=1.61 avg=1.62\n", "[319 | 366.61] loss=1.14 avg=1.62\n", "[320 | 367.54] loss=0.88 avg=1.61\n", "[321 | 368.47] loss=1.66 avg=1.61\n", "[322 | 369.40] loss=0.88 avg=1.60\n", "[323 | 370.32] loss=0.77 avg=1.59\n", "[324 | 371.26] loss=1.39 avg=1.59\n", "[325 | 372.19] loss=1.60 avg=1.59\n", "[326 | 373.12] loss=0.89 avg=1.58\n", "[327 | 374.05] loss=1.57 avg=1.58\n", "[328 | 374.98] loss=1.62 avg=1.58\n", "[329 | 375.91] loss=2.22 avg=1.59\n", "[330 | 376.84] loss=1.21 avg=1.59\n", "[331 | 377.77] loss=1.09 avg=1.58\n", "[332 | 378.70] loss=1.68 avg=1.58\n", "[333 | 379.64] loss=0.57 avg=1.57\n", "[334 | 380.57] loss=0.94 avg=1.57\n", "[335 | 381.51] loss=0.59 avg=1.56\n", "[336 | 382.44] loss=1.25 avg=1.55\n", "[337 | 383.38] loss=1.40 avg=1.55\n", "[338 
| 384.31] loss=0.87 avg=1.54\n", "[339 | 385.24] loss=0.54 avg=1.53\n", "[340 | 386.17] loss=1.17 avg=1.53\n", "[341 | 387.10] loss=0.98 avg=1.52\n", "[342 | 388.04] loss=1.51 avg=1.52\n", "[343 | 388.96] loss=0.44 avg=1.51\n", "[344 | 389.89] loss=1.37 avg=1.51\n", "[345 | 390.81] loss=1.65 avg=1.51\n", "[346 | 391.75] loss=1.73 avg=1.51\n", "[347 | 392.67] loss=1.36 avg=1.51\n", "[348 | 393.61] loss=1.15 avg=1.51\n", "[349 | 394.54] loss=0.94 avg=1.50\n", "[350 | 395.47] loss=1.27 avg=1.50\n", "[351 | 396.39] loss=1.38 avg=1.50\n", "[352 | 397.32] loss=0.92 avg=1.49\n", "[353 | 398.25] loss=1.13 avg=1.49\n", "[354 | 399.17] loss=1.38 avg=1.49\n", "[355 | 400.10] loss=0.82 avg=1.48\n", "[356 | 401.03] loss=1.94 avg=1.49\n", "[357 | 401.95] loss=0.82 avg=1.48\n", "[358 | 402.87] loss=0.41 avg=1.47\n", "[359 | 403.80] loss=2.16 avg=1.48\n", "[360 | 404.73] loss=2.05 avg=1.48\n", "[361 | 405.66] loss=0.86 avg=1.48\n", "[362 | 406.60] loss=1.46 avg=1.48\n", "[363 | 407.53] loss=1.14 avg=1.47\n", "[364 | 408.46] loss=1.03 avg=1.47\n", "[365 | 409.40] loss=1.86 avg=1.47\n", "[366 | 410.33] loss=1.84 avg=1.48\n", "[367 | 411.27] loss=1.29 avg=1.47\n", "[368 | 412.19] loss=0.92 avg=1.47\n", "[369 | 413.12] loss=2.56 avg=1.48\n", "[370 | 414.05] loss=0.87 avg=1.47\n", "[371 | 414.98] loss=1.09 avg=1.47\n", "[372 | 415.90] loss=0.86 avg=1.46\n", "[373 | 416.83] loss=1.37 avg=1.46\n", "[374 | 417.77] loss=1.08 avg=1.46\n", "[375 | 418.70] loss=1.24 avg=1.46\n", "[376 | 419.63] loss=1.53 avg=1.46\n", "[377 | 420.56] loss=1.00 avg=1.45\n", "[378 | 421.50] loss=0.83 avg=1.45\n", "[379 | 422.42] loss=1.16 avg=1.44\n", "[380 | 423.35] loss=1.40 avg=1.44\n", "[381 | 424.27] loss=1.45 avg=1.44\n", "[382 | 425.21] loss=1.42 avg=1.44\n", "[383 | 426.14] loss=1.52 avg=1.44\n", "[384 | 427.07] loss=0.55 avg=1.43\n", "[385 | 428.00] loss=0.55 avg=1.42\n", "[386 | 428.94] loss=2.22 avg=1.43\n", "[387 | 429.88] loss=0.53 avg=1.42\n", "[388 | 430.81] loss=0.76 avg=1.42\n", "[389 | 431.75] 
loss=0.72 avg=1.41\n", "[390 | 432.68] loss=1.05 avg=1.41\n", "[391 | 433.61] loss=0.59 avg=1.40\n", "[392 | 434.54] loss=1.73 avg=1.40\n", "[393 | 435.47] loss=0.94 avg=1.40\n", "[394 | 436.39] loss=1.34 avg=1.40\n", "[395 | 437.32] loss=0.58 avg=1.39\n", "[396 | 438.26] loss=0.55 avg=1.38\n", "[397 | 439.19] loss=1.17 avg=1.38\n", "[398 | 440.12] loss=1.58 avg=1.38\n", "[399 | 441.04] loss=0.65 avg=1.37\n", "Generating samples...\n", "======== SAMPLE 1 ========\n", " orientation, and the direction of alignment of the fibers.\n", "In particular, we shall consider a diffusive model for chemotaxis. In particular, we shall consider a diffusion-advection model. In particular, we shall consider a drift-diffusion model. In particular, we shall consider a non-local sensing model. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a directional cue. In particular, we shall consider a directional cue. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. 
In particular, we shall consider a nonsharing non-local non-sensing kernel.\n", "In particular, we shall consider a non-local non-sensing kernel for which the kernel distribution is non-zero. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a non-local non-sensing kernel. In particular, we shall consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\n", "In particular, we shall consider a non-local non-sensing kernel for which the kernel distribution is non-zero. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\n", "In particular, we shall not consider a nonsharing nonsharing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\n", "In particular, we shall not consider a nonsharing nonsharing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel. 
In particular, we shall not consider a nonsharing nonsharing kernel.\n", "In particular, we shall not consider a nonsharing nonsharing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\n", "In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing nonsharing kernel.\n", "In particular, we shall not consider a non-local non-sensing kernel. In particular, we shall not consider a nonsharing non-local non-sensing kernel. In particular, we shall not consider a nonsharing\n", "\n", "[400 | 462.48] loss=1.44 avg=1.37\n", "[401 | 463.41] loss=0.61 avg=1.36\n", "[402 | 464.34] loss=0.57 avg=1.36\n", "[403 | 465.27] loss=0.79 avg=1.35\n", "[404 | 466.20] loss=0.37 avg=1.34\n", "[405 | 467.13] loss=0.87 avg=1.34\n", "[406 | 468.06] loss=0.73 avg=1.33\n", "[407 | 468.99] loss=1.05 avg=1.33\n", "[408 | 469.93] loss=1.21 avg=1.33\n", "[409 | 470.86] loss=0.55 avg=1.32\n", "[410 | 471.80] loss=2.09 avg=1.33\n", "[411 | 472.74] loss=0.61 avg=1.32\n", "[412 | 473.66] loss=1.43 avg=1.32\n", "[413 | 474.59] loss=0.50 avg=1.31\n", "[414 | 475.53] loss=1.01 avg=1.31\n", "[415 | 476.46] loss=1.49 avg=1.31\n", "[416 | 477.38] loss=0.71 avg=1.30\n", "[417 | 478.31] loss=0.53 avg=1.30\n", "[418 | 479.24] loss=0.71 avg=1.29\n", "[419 | 480.17] loss=0.94 avg=1.29\n", "[420 | 481.11] loss=0.92 avg=1.28\n", "[421 | 482.03] loss=1.16 avg=1.28\n", "[422 | 482.96] loss=0.75 avg=1.28\n", "[423 | 483.89] loss=1.20 avg=1.28\n", "[424 | 484.82] loss=0.50 avg=1.27\n", "[425 | 485.75] loss=0.49 avg=1.26\n", "[426 | 486.68] loss=0.81 avg=1.25\n", "[427 | 487.60] loss=1.84 avg=1.26\n", "[428 | 488.53] loss=0.55 
avg=1.25\n", "[429 | 489.45] loss=0.38 avg=1.24\n", "[430 | 490.38] loss=1.23 avg=1.24\n", "[431 | 491.30] loss=0.91 avg=1.24\n", "[432 | 492.23] loss=0.77 avg=1.24\n", "[433 | 493.15] loss=0.70 avg=1.23\n", "[434 | 494.09] loss=1.00 avg=1.23\n", "[435 | 495.01] loss=1.86 avg=1.24\n", "[436 | 495.95] loss=1.14 avg=1.23\n", "[437 | 496.88] loss=0.73 avg=1.23\n", "[438 | 497.82] loss=0.51 avg=1.22\n", "[439 | 498.76] loss=0.62 avg=1.22\n", "[440 | 499.68] loss=1.04 avg=1.21\n", "[441 | 500.61] loss=1.39 avg=1.22\n", "[442 | 501.54] loss=0.60 avg=1.21\n", "[443 | 502.47] loss=0.99 avg=1.21\n", "[444 | 503.40] loss=1.10 avg=1.21\n", "[445 | 504.33] loss=1.03 avg=1.20\n", "[446 | 505.26] loss=0.93 avg=1.20\n", "[447 | 506.19] loss=0.89 avg=1.20\n", "[448 | 507.12] loss=0.77 avg=1.19\n", "[449 | 508.05] loss=1.22 avg=1.19\n", "[450 | 508.98] loss=0.78 avg=1.19\n", "[451 | 509.90] loss=0.68 avg=1.18\n", "[452 | 510.83] loss=0.34 avg=1.18\n", "[453 | 511.78] loss=0.82 avg=1.17\n", "[454 | 512.71] loss=0.49 avg=1.17\n", "[455 | 513.64] loss=0.92 avg=1.16\n", "[456 | 514.58] loss=1.10 avg=1.16\n", "[457 | 515.53] loss=0.68 avg=1.16\n", "[458 | 516.46] loss=0.36 avg=1.15\n", "[459 | 517.40] loss=0.37 avg=1.14\n", "[460 | 518.34] loss=1.00 avg=1.14\n", "[461 | 519.27] loss=0.76 avg=1.14\n", "[462 | 520.19] loss=0.32 avg=1.13\n", "[463 | 521.12] loss=0.35 avg=1.12\n", "[464 | 522.05] loss=0.32 avg=1.11\n", "[465 | 522.98] loss=0.93 avg=1.11\n", "[466 | 523.91] loss=0.78 avg=1.11\n", "[467 | 524.84] loss=0.55 avg=1.10\n", "[468 | 525.77] loss=0.70 avg=1.10\n", "[469 | 526.70] loss=0.80 avg=1.09\n", "[470 | 527.62] loss=0.68 avg=1.09\n", "[471 | 528.54] loss=0.68 avg=1.09\n", "[472 | 529.46] loss=0.42 avg=1.08\n", "[473 | 530.39] loss=0.43 avg=1.07\n", "[474 | 531.32] loss=0.63 avg=1.07\n", "[475 | 532.24] loss=0.56 avg=1.06\n", "[476 | 533.17] loss=0.63 avg=1.06\n", "[477 | 534.09] loss=1.12 avg=1.06\n", "[478 | 535.01] loss=0.33 avg=1.05\n", "[479 | 535.95] loss=0.87 
avg=1.05\n", "[480 | 536.88] loss=0.43 avg=1.04\n", "[481 | 537.82] loss=1.60 avg=1.05\n", "[482 | 538.76] loss=0.51 avg=1.04\n", "[483 | 539.70] loss=0.76 avg=1.04\n", "[484 | 540.63] loss=0.46 avg=1.04\n", "[485 | 541.55] loss=0.71 avg=1.03\n", "[486 | 542.49] loss=1.91 avg=1.04\n", "[487 | 543.42] loss=1.45 avg=1.05\n", "[488 | 544.35] loss=0.51 avg=1.04\n", "[489 | 545.28] loss=0.66 avg=1.04\n", "[490 | 546.20] loss=0.39 avg=1.03\n", "[491 | 547.13] loss=0.88 avg=1.03\n", "[492 | 548.06] loss=0.40 avg=1.02\n", "[493 | 549.00] loss=0.44 avg=1.02\n", "[494 | 549.92] loss=0.51 avg=1.01\n", "[495 | 550.85] loss=1.18 avg=1.01\n", "[496 | 551.77] loss=0.48 avg=1.01\n", "[497 | 552.71] loss=0.48 avg=1.00\n", "[498 | 553.64] loss=0.94 avg=1.00\n", "[499 | 554.57] loss=0.60 avg=1.00\n", "Generating samples...\n", "======== SAMPLE 1 ========\n", ", the average speed of the fibers is given by\n", "the momentum\n", "T[q, S](x, v, vˆ) = c(x)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ Z\n", "R+\n", "γq(λ) q(x + λvˆ, vˆ) dλ ψ(v).\n", "In particular, the velocity\n", "T[q, S] = T0(q, v, vˆ)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) q(x + λvˆ, vˆ) dλ =\n", "0\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ\n", "(1)\n", "and the average\n", "T0[q, S] = T0(q, v, vˆ)\n", "T0(k, v, vˆ)\n", "T0(a) =\n", "Γ\n", "Γ\n", "q\n", "k\n", "\n", "h\n", "\n", "| Γ\n", "S\n", "e\n", "i\n", "|\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γq\n", "k\n", "h\n", "I\n", "(a) vˆ =\n", "I(a)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γq\n", "k\n", "\n", "I\n", "(b) vˆ =\n", "I(b)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "\n", "h\n", "I\n", "(c) vˆ =\n", "I(c)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "\n", "h\n", "I\n", "(d) vˆ =\n", "I(d)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(e) vˆ =\n", "I(e)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", 
"q\n", "k\n", "h\n", "I\n", "(f) vˆ =\n", "I(f)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(g) vˆ =\n", "I(g)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(h) vˆ =\n", "I(h)\n", "Z\n", "R+\n", "γS (λ)S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(i) vˆ =\n", "I(i)\n", "Z\n", "R+\n", "γS (λ) S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(k) vˆ =\n", "I(k)\n", "Z\n", "R+\n", "γS (λ) S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(l) vˆ =\n", "I(l)\n", "Z\n", "R+\n", "γS (λ) S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(m) vˆ =\n", "\n", "I(m)\n", "Z\n", "R+\n", "γS (λ) S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(n) vˆ =\n", "\n", "I(n)\n", "Z\n", "R+\n", "γS (λ) S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "q\n", "k\n", "h\n", "I\n", "(o) vˆ =\n", "\n", "I(o)\n", "Z\n", "R+\n", "γS (λ) S(x + λvˆ) dλ i = 1\n", "Γ\n", "Γ\n", "\n", "[500 | 576.77] loss=0.66 avg=0.99\n", "[501 | 577.70] loss=0.77 avg=0.99\n", "[502 | 578.62] loss=0.26 avg=0.98\n", "[503 | 579.54] loss=0.15 avg=0.98\n", "[504 | 580.47] loss=1.50 avg=0.98\n", "[505 | 581.40] loss=0.53 avg=0.98\n", "[506 | 582.33] loss=0.56 avg=0.97\n", "[507 | 583.26] loss=0.41 avg=0.97\n", "[508 | 584.20] loss=0.32 avg=0.96\n", "[509 | 585.13] loss=0.35 avg=0.95\n", "[510 | 586.06] loss=0.74 avg=0.95\n", "[511 | 586.99] loss=0.46 avg=0.95\n", "[512 | 587.93] loss=0.78 avg=0.95\n", "[513 | 588.86] loss=0.67 avg=0.94\n", "[514 | 589.79] loss=0.45 avg=0.94\n", "[515 | 590.73] loss=0.97 avg=0.94\n", "[516 | 591.66] loss=0.94 avg=0.94\n", "[517 | 592.58] loss=0.36 avg=0.93\n", "[518 | 593.51] loss=0.69 avg=0.93\n", "[519 | 594.44] loss=1.25 avg=0.93\n", "[520 | 595.36] loss=0.78 avg=0.93\n", "[521 | 596.29] loss=0.58 avg=0.93\n", "[522 | 597.23] loss=0.42 avg=0.92\n", "[523 | 598.16] loss=0.43 avg=0.92\n", "[524 | 599.09] loss=0.30 
avg=0.91\n", "[525 | 600.03] loss=0.62 avg=0.91\n", "[526 | 600.96] loss=0.21 avg=0.90\n", "[527 | 601.90] loss=0.57 avg=0.90\n", "[528 | 602.83] loss=0.47 avg=0.89\n", "[529 | 603.77] loss=0.61 avg=0.89\n", "[530 | 604.71] loss=0.69 avg=0.89\n", "[531 | 605.64] loss=0.57 avg=0.89\n", "[532 | 606.57] loss=0.34 avg=0.88\n", "[533 | 607.49] loss=0.95 avg=0.88\n", "[534 | 608.42] loss=0.93 avg=0.88\n", "[535 | 609.34] loss=1.00 avg=0.88\n", "[536 | 610.27] loss=0.45 avg=0.88\n", "[537 | 611.20] loss=0.53 avg=0.88\n", "[538 | 612.12] loss=0.29 avg=0.87\n", "[539 | 613.05] loss=0.27 avg=0.86\n", "[540 | 613.97] loss=0.60 avg=0.86\n", "[541 | 614.90] loss=0.42 avg=0.86\n", "[542 | 615.82] loss=0.22 avg=0.85\n", "[543 | 616.74] loss=0.41 avg=0.85\n", "[544 | 617.67] loss=0.17 avg=0.84\n", "[545 | 618.59] loss=0.43 avg=0.83\n", "[546 | 619.52] loss=0.47 avg=0.83\n", "[547 | 620.44] loss=0.69 avg=0.83\n", "[548 | 621.36] loss=0.27 avg=0.82\n", "[549 | 622.29] loss=0.65 avg=0.82\n", "[550 | 623.22] loss=1.12 avg=0.83\n", "[551 | 624.15] loss=0.54 avg=0.82\n", "[552 | 625.08] loss=0.46 avg=0.82\n", "[553 | 626.02] loss=0.62 avg=0.82\n", "[554 | 626.96] loss=0.27 avg=0.81\n", "[555 | 627.89] loss=0.35 avg=0.81\n", "[556 | 628.82] loss=0.25 avg=0.80\n", "[557 | 629.76] loss=0.41 avg=0.80\n", "[558 | 630.69] loss=0.26 avg=0.79\n", "[559 | 631.62] loss=0.68 avg=0.79\n", "[560 | 632.56] loss=0.24 avg=0.78\n", "[561 | 633.49] loss=0.21 avg=0.78\n", "[562 | 634.42] loss=0.30 avg=0.77\n", "[563 | 635.35] loss=0.32 avg=0.77\n", "[564 | 636.29] loss=0.31 avg=0.77\n", "[565 | 637.21] loss=0.36 avg=0.76\n", "[566 | 638.14] loss=0.52 avg=0.76\n", "[567 | 639.08] loss=0.16 avg=0.75\n", "[568 | 640.01] loss=0.50 avg=0.75\n", "[569 | 640.95] loss=0.73 avg=0.75\n", "[570 | 641.88] loss=0.50 avg=0.75\n", "[571 | 642.81] loss=0.43 avg=0.74\n", "[572 | 643.75] loss=0.68 avg=0.74\n", "[573 | 644.67] loss=0.61 avg=0.74\n", "[574 | 645.61] loss=0.13 avg=0.74\n", "[575 | 646.54] loss=0.21 
avg=0.73\n", "[576 | 647.47] loss=0.34 avg=0.73\n", "[577 | 648.39] loss=0.33 avg=0.72\n", "[578 | 649.32] loss=0.22 avg=0.72\n", "[579 | 650.26] loss=0.52 avg=0.72\n", "[580 | 651.19] loss=0.26 avg=0.71\n", "[581 | 652.12] loss=0.51 avg=0.71\n", "[582 | 653.05] loss=0.76 avg=0.71\n", "[583 | 653.97] loss=0.73 avg=0.71\n", "[584 | 654.91] loss=0.34 avg=0.71\n", "[585 | 655.84] loss=0.42 avg=0.70\n", "[586 | 656.76] loss=0.48 avg=0.70\n", "[587 | 657.69] loss=0.34 avg=0.70\n", "[588 | 658.62] loss=0.33 avg=0.69\n", "[589 | 659.54] loss=0.56 avg=0.69\n", "[590 | 660.47] loss=0.52 avg=0.69\n", "[591 | 661.40] loss=0.32 avg=0.69\n", "[592 | 662.32] loss=0.19 avg=0.68\n", "[593 | 663.25] loss=0.21 avg=0.68\n", "[594 | 664.16] loss=0.16 avg=0.67\n", "[595 | 665.10] loss=0.69 avg=0.67\n", "[596 | 666.02] loss=0.27 avg=0.67\n", "[597 | 666.95] loss=0.17 avg=0.66\n", "[598 | 667.88] loss=0.93 avg=0.67\n", "[599 | 668.82] loss=0.24 avg=0.66\n", "Generating samples...\n", "======== SAMPLE 1 ========\n", " values, and the kernel\n", "(x, v, vˆ) is the average of the variance of the\n", "choosing direction. 
The mean direction is given by the mean/(vˆ)\n", "distribution\n", "0, v, vˆ =\n", "20\n", "4\n", "4\n", "(1) (2) (3)\n", "(4)\n", "(5)\n", "and the variance, i.e., the\n", "turning kernel, given by the\n", "turning(k(x)) =\n", "90\n", "Γ\n", "0\n", "q(x, v, vˆ)\n", "0\n", "that is the turning operator, is given by\n", "UT\n", "dv = V\n", "0\n", "Γ\n", "q(x, v, vˆ)\n", "1\n", "that is the variance of the choice of the\n", "turning operator, given by UT\n", "dvˆ = UT\n", "2\n", "Γ\n", "q(x, v, vˆ)\n", "2\n", "that is the turning\n", "direction given by UT\n", "0\n", "Γ\n", "q\n", "(x, v, vˆ)\n", "(6) (7)\n", "and the mean velocity\n", "Vˆ\n", "0\n", "(max) =\n", "70\n", "Γ\n", "q\n", "(x, v, vˆ)\n", "(8)\n", "and the\n", "U1\n", "q\n", "(x, v, vˆ) =\n", "70\n", "Γ\n", "q\n", "(x, v, vˆ)\n", "(9)\n", "and the\n", "D\n", "0\n", "T\n", "(ξ) =\n", "U¯\n", "(ξ)\n", "∇ · Dq\n", "0\n", "T\n", "(ξ)\n", "· (v0, v1) + ξv (ξ) (v0, v1).\n", "Re-scaling the space variable as in (6), we have\n", "D\n", "0\n", "T\n", "(ξ) =\n", "U¯\n", "ξ\n", "Γ\n", "q\n", "(ξ)\n", "∇ · Dq\n", "0\n", "T\n", "(ξ)\n", "· (v0, v1) + ξv (ξ) (v0, v1).\n", "The mean direction is given by the variance of the\n", "turning direction given by\n", "UT (ξ) = UT\n", "Γ\n", "q\n", "(ξ)U¯\n", "Γ\n", "q\n", "(ξ)∇ · Dq\n", "0\n", "T\n", "(ξ)\n", ". (10)\n", "As a consequence, the macroscopic behavior is strongly affected by the\n", "turning operator, that is\n", "D\n", "0\n", "T\n", "(ξ) =\n", "U¯\n", "ξ\n", "Γ\n", "q\n", "(ξ)∇ · Dq\n", "0\n", "T\n", "(ξ)\n", ". (11)\n", "In particular, the sensing radius of the cells is given by\n", "S\n", "c(x, y) =\n", "Γ\n", "q\n", "(x, y)U¯\n", "Γ\n", "q\n", "(x, y)Γ\n", "both\n", "i\n", " and ii\n", ", v\n", ", dv\n", ",\n", "dvˆ\n", ",\n", "Γ\n", "q\n", ",\n", "are given by (12) and (13). The chemoattractant has a Gaussian\n", "c = c(x, y)\n", "and on the left the two values of c both have to be in the\n", "same direction. 
Therefore, the sensing radius of the cells is given by\n", "the momentum\n", "T = S\n", "c(x, y) =\n", "v\n", "(x, y)\n", "Γ\n", "q\n", "(x, y)\n", "vˆ + Γ ii\n", "(x, y)\n", "iiˆ + Γ\n", "q\n", "(x, y)\n", "iiˆ + Γ\n", "v\n", "(x, y)\n", "(14)\n", "and the sensing function Γ =\n", "Γ\n", "q\n", "(x, y)U¯\n", "Γ\n", ". (15)\n", "In particular, when the two sensing functions are independent,\n", "when Γ is equal to Γvˆ, we have that the weighted average\n", "for the two velocities is given by the momentum\n", "T = S\n", "c(x, y) =\n", "\n", "v\n", "(x, y)\n", "Γ\n", "q\n", "(x, y)\n", "Γ\n", "\n", "i\n", "and\n", "k\n", ":= vˆ k(x, y)\n", ". (16)\n", "This translates into\n", "k(x, y) =\n", "vˆ(x)\n", ",\n", "that is the kurtosis\n", "T\n", "(ξ) = vˆ(x)\n", ",\n", "that is the tach statistic\n", "DT\n", "(ξ) = u\n", "T\n", "(ξ)\n", "∇T\n", "(ξ)\n", ". (\n", "\n", "[600 | 690.44] loss=0.29 avg=0.66\n", "[601 | 691.37] loss=0.45 avg=0.66\n", "[602 | 692.29] loss=0.27 avg=0.65\n", "[603 | 693.22] loss=0.38 avg=0.65\n", "[604 | 694.15] loss=0.33 avg=0.65\n", "[605 | 695.09] loss=0.89 avg=0.65\n", "[606 | 696.01] loss=0.59 avg=0.65\n", "[607 | 696.94] loss=0.27 avg=0.64\n", "[608 | 697.86] loss=0.61 avg=0.64\n", "[609 | 698.80] loss=0.33 avg=0.64\n", "[610 | 699.72] loss=0.80 avg=0.64\n", "[611 | 700.65] loss=0.49 avg=0.64\n", "[612 | 701.58] loss=0.35 avg=0.64\n", "[613 | 702.51] loss=0.26 avg=0.63\n", "[614 | 703.43] loss=0.47 avg=0.63\n", "[615 | 704.36] loss=0.41 avg=0.63\n", "[616 | 705.28] loss=0.42 avg=0.63\n", "[617 | 706.20] loss=0.69 avg=0.63\n", "[618 | 707.13] loss=0.41 avg=0.63\n", "[619 | 708.06] loss=0.44 avg=0.62\n", "[620 | 709.00] loss=0.23 avg=0.62\n", "[621 | 709.94] loss=0.46 avg=0.62\n", "[622 | 710.88] loss=0.34 avg=0.62\n", "[623 | 711.82] loss=0.34 avg=0.61\n", "[624 | 712.74] loss=0.26 avg=0.61\n", "[625 | 713.67] loss=0.23 avg=0.61\n", "[626 | 714.59] loss=0.35 avg=0.60\n", "[627 | 715.52] loss=0.50 avg=0.60\n", "[628 | 716.45] loss=0.31 
avg=0.60\n", "[629 | 717.38] loss=0.36 avg=0.60\n", "[630 | 718.31] loss=0.13 avg=0.59\n", "[631 | 719.24] loss=0.29 avg=0.59\n", "[632 | 720.17] loss=0.20 avg=0.59\n", "[633 | 721.10] loss=0.23 avg=0.58\n", "[634 | 722.03] loss=0.16 avg=0.58\n", "[635 | 722.96] loss=0.14 avg=0.57\n", "[636 | 723.90] loss=0.31 avg=0.57\n", "[637 | 724.83] loss=0.80 avg=0.57\n", "[638 | 725.78] loss=0.31 avg=0.57\n", "[639 | 726.72] loss=0.38 avg=0.57\n", "[640 | 727.65] loss=0.23 avg=0.57\n", "[641 | 728.59] loss=0.24 avg=0.56\n", "[642 | 729.53] loss=0.35 avg=0.56\n", "[643 | 730.46] loss=0.19 avg=0.56\n", "[644 | 731.39] loss=0.16 avg=0.55\n", "[645 | 732.32] loss=0.24 avg=0.55\n", "[646 | 733.25] loss=0.10 avg=0.54\n", "[647 | 734.18] loss=0.40 avg=0.54\n", "[648 | 735.11] loss=0.28 avg=0.54\n", "[649 | 736.04] loss=0.36 avg=0.54\n", "[650 | 736.97] loss=0.27 avg=0.54\n", "[651 | 737.90] loss=0.78 avg=0.54\n", "interrupted\n", "Saving checkpoint/run1/model-652\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "z-zAFd2hLQ2V", "colab_type": "code", "colab": {} }, "source": [ "#@title Step 10: Creating a Training Model directory\n", "#Creating a Training Model directory named 'tgmodel'\n", "import os\n", "run_dir = '/content/gpt-2/models/tgmodel'\n", "if not os.path.exists(run_dir):\n", " os.makedirs(run_dir)" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "-POx-g1Ql76C", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 68 }, "outputId": "4a14528a-cb8e-4b8c-9b8a-9a5cfab4c6fe" }, "source": [ "#@title Step 10A: Copying Training Files\n", "#The checkpoint number must match the last checkpoint train.py saved\n", "#(this run was interrupted at model-652, so model-1000 does not exist yet,\n", "#which is why the cp commands below report 'cannot stat' in the output)\n", "!cp /content/gpt-2/src/checkpoint/run1/model-1000.data-00000-of-00001 /content/gpt-2/models/tgmodel\n", "!cp /content/gpt-2/src/checkpoint/run1/checkpoint /content/gpt-2/models/tgmodel\n", "!cp /content/gpt-2/src/checkpoint/run1/model-1000.index /content/gpt-2/models/tgmodel\n", "!cp /content/gpt-2/src/checkpoint/run1/model-1000.meta 
/content/gpt-2/models/tgmodel" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "cp: cannot stat '/content/gpt-2/src/checkpoint/run1/model-1000.data-00000-of-00001': No such file or directory\n", "cp: cannot stat '/content/gpt-2/src/checkpoint/run1/model-1000.index': No such file or directory\n", "cp: cannot stat '/content/gpt-2/src/checkpoint/run1/model-1000.meta': No such file or directory\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "hdE9nNH8m7VD", "colab_type": "code", "colab": {} }, "source": [ "#@title Step 10B: Copying the OpenAI GPT-2 117M Model files\n", "!cp /content/gpt-2/models/117M/encoder.json /content/gpt-2/models/tgmodel\n", "!cp /content/gpt-2/models/117M/hparams.json /content/gpt-2/models/tgmodel\n", "!cp /content/gpt-2/models/117M/vocab.bpe /content/gpt-2/models/tgmodel" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "3G8NOUXjMq4u", "colab_type": "code", "colab": {} }, "source": [ "#@title Step 10C: Renaming the model directories\n", "import os\n", "!mv /content/gpt-2/models/117M /content/gpt-2/models/117M_OpenAI\n", "!mv /content/gpt-2/models/tgmodel /content/gpt-2/models/117M" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "h3uexz_e4d18", "colab_type": "code", "colab": {} }, "source": [ "#@title Step 11: Generating Unconditional Samples\n", "import os # import after runtime is restarted\n", "os.chdir(\"/content/gpt-2/src\")\n", "!python generate_unconditional_samples.py --model_name '117M'" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "6HI7DuBK4iSU", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 1000 }, "outputId": "aced43aa-7fcc-4bac-99c7-8bfd7ded0028" }, "source": [ "#@title Step 12: Interactive Context and Completion Examples\n", "import os # import after runtime is restarted\n", "os.chdir(\"/content/gpt-2/src\")\n", "!python 
interactive_conditional_samples.py --temperature 0.8 --top_k 40 --model_name '117M' --length 50" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "WARNING:tensorflow:From interactive_conditional_samples.py:57: The name tf.Session is deprecated. Please use tf.compat.v1.Session instead.\n", "\n", "2020-06-29 09:30:02.273624: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1\n", "2020-06-29 09:30:02.292947: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.293714: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1639] Found device 0 with properties: \n", "name: Tesla K80 major: 3 minor: 7 memoryClockRate(GHz): 0.8235\n", "pciBusID: 0000:00:04.0\n", "2020-06-29 09:30:02.294023: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\n", "2020-06-29 09:30:02.295631: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\n", "2020-06-29 09:30:02.297301: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10\n", "2020-06-29 09:30:02.297699: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10\n", "2020-06-29 09:30:02.299362: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10\n", "2020-06-29 09:30:02.300174: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10\n", "2020-06-29 09:30:02.303450: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7\n", "2020-06-29 09:30:02.303619: I 
tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.304415: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.305120: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1767] Adding visible gpu devices: 0\n", "2020-06-29 09:30:02.310474: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2299995000 Hz\n", "2020-06-29 09:30:02.310737: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1426d80 initialized for platform Host (this does not guarantee that XLA will be used). Devices:\n", "2020-06-29 09:30:02.310775: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version\n", "2020-06-29 09:30:02.360414: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.361376: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x1426f40 initialized for platform CUDA (this does not guarantee that XLA will be used). 
Devices:\n", "2020-06-29 09:30:02.361416: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Tesla K80, Compute Capability 3.7\n", "2020-06-29 09:30:02.361699: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.362523: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1639] Found device 0 with properties: \n", "name: Tesla K80 major: 3 minor: 7 memoryClockRate(GHz): 0.8235\n", "pciBusID: 0000:00:04.0\n", "2020-06-29 09:30:02.362622: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\n", "2020-06-29 09:30:02.362681: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\n", "2020-06-29 09:30:02.362735: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10\n", "2020-06-29 09:30:02.362790: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10\n", "2020-06-29 09:30:02.362854: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10\n", "2020-06-29 09:30:02.362922: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10\n", "2020-06-29 09:30:02.362980: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7\n", "2020-06-29 09:30:02.363153: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.364047: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS 
had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.364759: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1767] Adding visible gpu devices: 0\n", "2020-06-29 09:30:02.364834: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1\n", "2020-06-29 09:30:02.366467: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1180] Device interconnect StreamExecutor with strength 1 edge matrix:\n", "2020-06-29 09:30:02.366509: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1186] 0 \n", "2020-06-29 09:30:02.366530: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1199] 0: N \n", "2020-06-29 09:30:02.366754: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.367607: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:983] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero\n", "2020-06-29 09:30:02.368323: W tensorflow/core/common_runtime/gpu/gpu_bfc_allocator.cc:39] Overriding allow_growth setting because the TF_FORCE_GPU_ALLOW_GROWTH environment variable is set. Original config value was 0.\n", "2020-06-29 09:30:02.368380: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1325] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 10805 MB memory) -> physical GPU (device: 0, name: Tesla K80, pci bus id: 0000:00:04.0, compute capability: 3.7)\n", "WARNING:tensorflow:From interactive_conditional_samples.py:58: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\n", "\n", "WARNING:tensorflow:From interactive_conditional_samples.py:60: The name tf.set_random_seed is deprecated. 
Please use tf.compat.v1.set_random_seed instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:51: The name tf.AUTO_REUSE is deprecated. Please use tf.compat.v1.AUTO_REUSE instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/model.py:148: The name tf.variable_scope is deprecated. Please use tf.compat.v1.variable_scope instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/model.py:152: The name tf.get_variable is deprecated. Please use tf.compat.v1.get_variable instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/model.py:36: The name tf.rsqrt is deprecated. Please use tf.math.rsqrt instead.\n", "\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:64: to_float (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Use `tf.cast` instead.\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:16: where (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Use tf.where in 2.0, which has the same broadcast rule as np.where\n", "WARNING:tensorflow:From /content/gpt-2/src/sample.py:67: multinomial (from tensorflow.python.ops.random_ops) is deprecated and will be removed in a future version.\n", "Instructions for updating:\n", "Use `tf.random.categorical` instead.\n", "WARNING:tensorflow:From interactive_conditional_samples.py:68: The name tf.train.Saver is deprecated. Please use tf.compat.v1.train.Saver instead.\n", "\n", "Model prompt >>> During such processes, cells sense the environment and respond to external factors that induce a certain direction of motion towards specific targets (taxis): this results in a persistent migration in a certain preferential direction. The guidance cues leading to directed migration may be biochemical or biophysical. 
Biochemical cues can be, for example, soluble factors or growth factors that give rise to chemotaxis, which involves a mono-directional stimulus. Other cues generating mono-directional stimuli include, for instance, bound ligands to the substratum that induce haptotaxis, durotaxis, that involves migration towards regions with an increasing stiffness of the ECM, electrotaxis, also known as galvanotaxis, that prescribes a directed motion guided by an electric field or current, or phototaxis, referring to the movement oriented by a stimulus of light [34]. Important biophysical cues are some of the properties of the extracellular matrix (ECM), first among all the alignment of collagen fibers and its stiffness. In particular, the fiber alignment is shown to stimulate contact guidance [22, 21]. TL;DR:\n", "2020-06-29 09:31:30.405327: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10\n", "======================================== SAMPLE 1 ========================================\n", " the ECM of a single tissue is the ECM that is the most effective.\n", "\n", "To address this concern, we developed a novel imaging and immunostaining scheme that, when activated, induces the conversion of a protein to its exogenous target\n", "================================================================================\n", "Model prompt >>> Traceback (most recent call last):\n", " File \"/usr/lib/python3.6/contextlib.py\", line 99, in __exit__\n", " self.gen.throw(type, value, traceback)\n", " File \"/tensorflow-1.15.2/python3.6/tensorflow_core/python/framework/ops.py\", line 5480, in get_controller\n", " yield g\n", " File \"interactive_conditional_samples.py\", line 73, in interact_model\n", " raw_text = input(\"Model prompt >>> \")\n", "KeyboardInterrupt\n", "\n", "During handling of the above exception, another exception occurred:\n", "\n", "Traceback (most recent call last):\n", " File 
\"interactive_conditional_samples.py\", line 91, in \n", " fire.Fire(interact_model)\n", " File \"/usr/local/lib/python3.6/dist-packages/fire/core.py\", line 138, in Fire\n", " component_trace = _Fire(component, args, parsed_flag_args, context, name)\n", " File \"/usr/local/lib/python3.6/dist-packages/fire/core.py\", line 468, in _Fire\n", " target=component.__name__)\n", " File \"/usr/local/lib/python3.6/dist-packages/fire/core.py\", line 672, in _CallAndUpdateTrace\n", " component = fn(*varargs, **kwargs)\n", " File \"interactive_conditional_samples.py\", line 88, in interact_model\n", " print(\"=\" * 80)\n", " File \"/tensorflow-1.15.2/python3.6/tensorflow_core/python/client/session.py\", line 1633, in __exit__\n", " close_thread.start()\n", " File \"/usr/lib/python3.6/threading.py\", line 851, in start\n", " self._started.wait()\n", " File \"/usr/lib/python3.6/threading.py\", line 551, in wait\n", " signaled = self._cond.wait(timeout)\n", " File \"/usr/lib/python3.6/threading.py\", line 295, in wait\n", " waiter.acquire()\n", "KeyboardInterrupt\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "ihVnmXFYB-E7", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 1000 }, "outputId": "bbe7802e-2e06-4c70-debb-bab34bfb0c2e" }, "source": [ "#@title Additional Tools: Controlling Tokenized Data\n", "#Unzip out.npz\n", "import zipfile\n", "with zipfile.ZipFile('/content/gpt-2/src/out.npz', 'r') as zip_ref:\n", " zip_ref.extractall('/content/gpt-2/src/')\n", "\n", "#Load arr_0.npy which contains encoded dset\n", "import numpy as np\n", "f=np.load('/content/gpt-2/src/arr_0.npy')\n", "print(f)\n", "print(f.shape)\n", "for i in range(0,10):\n", " print(f[i])\n", " \n", "#We first import encoder.json\n", "import json\n", "i=0\n", "with open(\"/content/gpt-2/models/117M/encoder.json\", \"r\") as read_file:\n", " print(\"Converting the JSON encoded data into a Python dictionary\")\n", " developer = json.load(read_file) 
#converts the encoded data into a Python dictionary\n", " for key, value in developer.items(): #we parse the decoded json data\n", " i+=1\n", " if(i>10):\n", " break;\n", " print(key, \":\", value)\n", "\n", "#We will now search for the key and value for each encoded token\n", " for i in range(0,500):\n", " for key, value in developer.items():\n", " if f[i]==value:\n", " print(key, \":\", value)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "[1212 5644 326 ... 13 198 2682]\n", "(29379,)\n", "1212\n", "5644\n", "326\n", "11\n", "355\n", "716\n", "78\n", "1765\n", "1868\n", "4778\n", "Converting JSON encoded data into Python dictionary\n", "! : 0\n", "\" : 1\n", "# : 2\n", "$ : 3\n", "% : 4\n", "& : 5\n", "' : 6\n", "( : 7\n", ") : 8\n", "* : 9\n", "This : 1212\n", "Ġsuggests : 5644\n", "Ġthat : 326\n", ", : 11\n", "Ġas : 355\n", "Ġam : 716\n", "o : 78\n", "eb : 1765\n", "oid : 1868\n", "Ġcells : 4778\n", "Ġare : 389\n", "Ġless : 1342\n", "Ġcontract : 2775\n", "ile : 576\n", ", : 11\n", "Ġwhile : 981\n", "Ġmes : 18842\n", "ench : 24421\n", "ym : 4948\n", "al : 282\n", "Ċ : 198\n", "cells : 46342\n", "Ġare : 389\n", "Ġmore : 517\n", "Ġcontract : 2775\n", "ile : 576\n", ", : 11\n", "Ġand : 290\n", "Ġthere : 612\n", "Ġmay : 743\n", "Ġbe : 307\n", "Ġa : 257\n", "Ġswitching : 15430\n", "Ġbetween : 1022\n", "Ġam : 716\n", "o : 78\n", "eb : 1765\n", "oid : 1868\n", "Ġand : 290\n", "Ġmes : 18842\n", "ench : 24421\n", "ym : 4948\n", "al : 282\n", "Ċ : 198\n", "m : 76\n", "igration : 4254\n", ", : 11\n", "Ġperhaps : 3737\n", "Ġthere : 612\n", "Ġcan : 460\n", "Ġalso : 635\n", "Ġbe : 307\n", "Ġa : 257\n", "Ġswitching : 15430\n", "Ġbetween : 1022\n", "Ġthe : 262\n", "Ġdominance : 18648\n", "Ġof : 286\n", "Ġchem : 4607\n", "ot : 313\n", "axis : 22704\n", "Ġ( : 357\n", "amo : 18811\n", "eb : 1765\n", "oid : 1868\n", "Ċ : 198\n", "m : 76\n", "igration : 4254\n", ") : 8\n", "Ġand : 290\n", "Ġcontact : 2800\n", "Ġguidance : 11154\n", "Ġ( : 357\n", "mes 
: 6880\n", "ench : 24421\n", "ym : 4948\n", "al : 282\n", "Ġmigration : 13472\n", ") : 8\n", "Ġ[ : 685\n", "60 : 1899\n", "]. : 4083\n", "ĠOne : 1881\n", "Ġof : 286\n", "Ġthe : 262\n", "Ġmost : 749\n", "Ġinteresting : 3499\n", "Ġ2 : 362\n", "D : 35\n", "Ċ : 198\n", "platform : 24254\n", "s : 82\n", ", : 11\n", "Ġallowing : 5086\n", "Ġto : 284\n", "Ġstudy : 2050\n", "Ġcontact : 2800\n", "Ġguidance : 11154\n", "Ġand : 290\n", "Ġchem : 4607\n", "ot : 313\n", "axis : 22704\n", ", : 11\n", "Ġwas : 373\n", "Ġproposed : 5150\n", "Ġin : 287\n", "Ġ[ : 685\n", "57 : 3553\n", "], : 4357\n", "Ġin : 287\n", "Ġwhich : 543\n", "Ġthe : 262\n", "Ċ : 198\n", "authors : 41617\n", "Ġdemonstrated : 9555\n", "Ġan : 281\n", "Ġadditive : 38298\n", "Ġeffect : 1245\n", "Ġof : 286\n", "Ġchemical : 5931\n", "Ġgrad : 3915\n", "ients : 2334\n", "Ġand : 290\n", "Ġfiber : 13608\n", "Ġalignment : 19114\n", "Ġby : 416\n", "Ġmeasuring : 15964\n", "Ċ : 198\n", "the : 1169\n", "Ġpersistence : 30802\n", "Ġtime : 640\n", "; : 26\n", "Ġthey : 484\n", "Ġalso : 635\n", "Ġobserved : 6515\n", "Ġthat : 326\n", "Ġcells : 4778\n", "Ġwere : 547\n", "Ġdirected : 7924\n", "Ġby : 416\n", "Ġfiber : 13608\n", "Ġalignment : 19114\n", "Ġand : 290\n", "Ġthere : 612\n", "Ġwas : 373\n", "Ċ : 198\n", "no : 3919\n", "Ġeffect : 1245\n", "Ġof : 286\n", "Ġthe : 262\n", "Ġchemical : 5931\n", "Ġgradient : 31312\n", "Ġwhen : 618\n", "Ġfibers : 26742\n", "Ġwere : 547\n", "Ġaligned : 19874\n", "Ġperpendicular : 47190\n", "Ġto : 284\n", "Ġit : 340\n", ". : 13\n", "ĠA : 317\n", "Ġsimilar : 2092\n", "Ġsetting : 4634\n", "Ċ : 198\n", "was : 9776\n", "Ġalso : 635\n", "Ġused : 973\n", "Ġfor : 329\n", "Ġstudying : 11065\n", "Ġthe : 262\n", "Ġdependence : 21403\n", "Ġof : 286\n", "Ġcontact : 2800\n", "Ġguidance : 11154\n", "Ġon : 319\n", "Ġthe : 262\n", "Ġcell : 2685\n", "Ġcycle : 6772\n", "Ġ[ : 685\n", "48 : 2780\n", "]. 
: 4083\n", "ĠHowever : 2102\n", ", : 11\n", "ĠIn : 554\n", "Ċ : 198\n", "the : 1169\n", "Ġcase : 1339\n", "Ġof : 286\n", "Ġdifferent : 1180\n", "Ġmulti : 5021\n", "- : 12\n", "direction : 37295\n", "al : 282\n", "Ġcues : 25288\n", ", : 11\n", "Ġtotally : 6635\n", "Ġdifferent : 1180\n", "Ġscenarios : 13858\n", "Ġmay : 743\n", "Ġhappen : 1645\n", ", : 11\n", "Ġe : 304\n", ". : 13\n", "g : 70\n", ". : 13\n", "Ġin : 287\n", "Ġ[ : 685\n", "51 : 4349\n", "] : 60\n", "Ġit : 340\n", "Ġis : 318\n", "Ċ : 198\n", "shown : 42579\n", "Ġthat : 326\n", "Ġfor : 329\n", "Ġcontact : 2800\n", "Ġguidance : 11154\n", "Ġand : 290\n", "Ġelect : 1742\n", "rot : 10599\n", "axis : 22704\n", "Ġin : 287\n", "Ġthe : 262\n", "Ġcor : 1162\n", "nea : 39718\n", ", : 11\n", "Ġelect : 1742\n", "rot : 10599\n", "axis : 22704\n", "Ġwins : 7864\n", "Ġwhen : 618\n", "Ġcompeting : 11780\n", "Ċ : 198\n", "with : 4480\n", "Ġthe : 262\n", "Ġdirection : 4571\n", "Ġof : 286\n", "Ġalignment : 19114\n", "Ġof : 286\n", "Ġthe : 262\n", "Ġfibers : 26742\n", ". 
: 13\n", "Ċ : 198\n", "Multi : 29800\n", "- : 12\n", "cue : 15509\n", "Ġkinetic : 37892\n", "Ġmodel : 2746\n", "Ġwith : 351\n", "Ġnon : 1729\n", "- : 12\n", "local : 12001\n", "Ġsensing : 34244\n", "Ġfor : 329\n", "Ġcell : 2685\n", "Ċ : 198\n", "m : 76\n", "igration : 4254\n", "Ġon : 319\n", "Ġa : 257\n", "Ġfibers : 26742\n", "Ġnetwork : 3127\n", "Ġwith : 351\n", "Ġchem : 4607\n", "ot : 313\n", "axis : 22704\n", "Ċ : 198\n", "Mart : 13143\n", "ina : 1437\n", "ĠCon : 1482\n", "te : 660\n", "ĠâĪ : 18872\n", "Ĺ : 245\n", "ĠNad : 21877\n", "ia : 544\n", "ĠL : 406\n", "oy : 726\n", "ĠâĢ : 564\n", "ł : 254\n", "âĢ : 447\n", "¡ : 94\n", "Ċ : 198\n", "June : 15749\n", "Ġ18 : 1248\n", ", : 11\n", "Ġ2020 : 12131\n", "Ċ : 198\n", "Abstract : 23839\n", "Ċ : 198\n", "C : 34\n", "ells : 19187\n", "Ġperform : 1620\n", "Ġdirected : 7924\n", "Ġmotion : 6268\n", "Ġin : 287\n", "Ġresponse : 2882\n", "Ġto : 284\n", "Ġexternal : 7097\n", "Ġstimuli : 25973\n", "Ġthat : 326\n", "Ġthey : 484\n", "Ġdetect : 4886\n", "Ġby : 416\n", "Ġsensing : 34244\n", "Ċ : 198\n", "the : 1169\n", "Ġenvironment : 2858\n", "Ġwith : 351\n", "Ġtheir : 511\n", "Ġmembrane : 25019\n", "Ġprot : 1237\n", "rus : 14932\n", "ions : 507\n", ". : 13\n", "ĠIn : 554\n", "Ġparticular : 1948\n", ", : 11\n", "Ġseveral : 1811\n", "Ġbiochemical : 47685\n", "Ġand : 290\n", "Ġbi : 3182\n", "ophysical : 41789\n", "Ġcues : 25288\n", "Ġgive : 1577\n", "Ġrise : 4485\n", "Ġto : 284\n", "Ġtactic : 18543\n", "Ġmigration : 13472\n", "Ġin : 287\n", "Ġthe : 262\n", "Ġdirection : 4571\n", "Ġof : 286\n", "Ġtheir : 511\n", "Ġspecific : 2176\n", "Ġtargets : 6670\n", ". 
: 13\n", "ĠThis : 770\n", "Ġdefines : 15738\n", "Ċ : 198\n", "a : 64\n", "Ġmulti : 5021\n", "- : 12\n", "cue : 15509\n", "Ġenvironment : 2858\n", "Ġin : 287\n", "Ġwhich : 543\n", "Ġcells : 4778\n", "Ġhave : 423\n", "Ġto : 284\n", "Ġsort : 3297\n", "Ġand : 290\n", "Ġcombine : 12082\n", "Ġdifferent : 1180\n", ", : 11\n", "Ġand : 290\n", "Ġpotentially : 6196\n", "Ċ : 198\n", "competitive : 46131\n", ", : 11\n", "Ġstimuli : 25973\n", ". : 13\n", "ĠWe : 775\n", "Ġpropose : 18077\n", "Ġa : 257\n", "Ġnon : 1729\n", "- : 12\n", "local : 12001\n", "Ġkinetic : 37892\n", "Ġmodel : 2746\n", "Ġfor : 329\n", "Ġcell : 2685\n", "Ġmigration : 13472\n", "Ġin : 287\n", "Ġpresence : 4931\n", "Ġof : 286\n", "Ċ : 198\n", "two : 11545\n", "Ġexternal : 7097\n", "Ġfactors : 5087\n", "Ġboth : 1111\n", "Ġinfluencing : 32596\n", "Ġcell : 2685\n", "Ġpolarization : 42704\n", ": : 25\n", "Ġcontact : 2800\n", "Ġguidance : 11154\n", "Ġand : 290\n", "Ġchem : 4607\n", "ot : 313\n", "axis : 22704\n", ". : 13\n", "ĠWe : 775\n", "Ċ : 198\n", "pro : 1676\n", "pose : 3455\n", "Ġtwo : 734\n", "Ġdifferent : 1180\n", "Ġsensing : 34244\n", "Ġstrategies : 10064\n", "Ġand : 290\n", "Ġwe : 356\n", "Ġanalyze : 16602\n", "Ġthe : 262\n", "Ġtwo : 734\n", "Ġresulting : 7186\n", "Ġmodels : 4981\n", "Ġby : 416\n", "Ġrecovering : 20222\n", "Ċ : 198\n", "the : 1169\n", "Ġappropriate : 5035\n", "Ġmacro : 15021\n", "sc : 1416\n", "opic : 16603\n", "Ġlimit : 4179\n", "Ġin : 287\n", "Ġdifferent : 1180\n", "Ġregimes : 25879\n", ", : 11\n", "Ġin : 287\n", "Ġorder : 1502\n", "Ġto : 284\n", "Ġsee : 766\n", "Ġhow : 703\n", "Ġthe : 262\n", "Ġsize : 2546\n", "Ġof : 286\n", "Ġthe : 262\n", "Ġcell : 2685\n", ", : 11\n", "Ċ : 198\n", "with : 4480\n", "Ġrespect : 2461\n", "Ġto : 284\n", "Ġthe : 262\n", "Ġvariation : 12291\n", "Ġof : 286\n", "Ġboth : 1111\n", "Ġexternal : 7097\n", "Ġfields : 7032\n", ", : 11\n", "Ġinfluences : 16717\n", "Ġthe : 262\n", "Ġoverall : 4045\n", "Ġbehavior : 4069\n", ". 
: 13\n", "ĠMoreover : 10968\n", ", : 11\n", "Ċ : 198\n", "we : 732\n", "Ġintegrate : 19386\n", "Ġnumer : 5470\n", "ically : 1146\n", "Ġthe : 262\n", "Ġkinetic : 37892\n", "Ġtransport : 4839\n", "Ġequation : 16022\n", "Ġin : 287\n", "Ġa : 257\n", "Ġtwo : 734\n", "- : 12\n", "dimensional : 19577\n", "Ġsetting : 4634\n", "Ġin : 287\n", "Ġorder : 1502\n", "Ċ : 198\n", "to : 1462\n", "Ġinvestigate : 9161\n", "Ġqual : 4140\n", "itatively : 48668\n", "Ġvarious : 2972\n", "Ġscenarios : 13858\n", ". : 13\n", "Ċ : 198\n", "Key : 9218\n", "word : 4775\n", ". : 13\n", "ĠKin : 16645\n", "etic : 5139\n", "Ġequations : 27490\n", ", : 11\n", "Ġmult : 1963\n", "isc : 2304\n", "ale : 1000\n", "Ġmodeling : 21128\n", ", : 11\n", "Ġmulti : 5021\n", "- : 12\n", "cue : 15509\n", ", : 11\n", "Ġnon : 1729\n", "- : 12\n", "local : 12001\n", ", : 11\n", "Ġhyd : 7409\n", "rod : 14892\n", "ynamic : 28995\n", "Ġlimit : 4179\n", ", : 11\n", "Ċ : 198\n" ], "name": "stdout" } ] } ] }

================================================ FILE: Chapter08/gpt-2-train_files/accumulate.py ================================================

import argparse
import json
import os
import numpy as np
import tensorflow as tf
import time


class AccumulatingOptimizer(object):
    def __init__(self, opt, var_list):
        self.opt = opt
        self.var_list = var_list
        self.accum_vars = {tv: tf.Variable(tf.zeros_like(tv.initialized_value()), trainable=False)
                           for tv in var_list}
        self.total_loss = tf.Variable(tf.zeros(shape=[], dtype=tf.float32))
        self.count_loss = tf.Variable(tf.zeros(shape=[], dtype=tf.float32))

    def reset(self):
        updates = [tv.assign(tf.zeros_like(tv)) for tv in self.accum_vars.values()]
        updates.append(self.total_loss.assign(tf.zeros(shape=[], dtype=tf.float32)))
        updates.append(self.count_loss.assign(tf.zeros(shape=[], dtype=tf.float32)))
        with tf.control_dependencies(updates):
            return tf.no_op()

    def compute_gradients(self, loss):
        grads = self.opt.compute_gradients(loss, self.var_list)
        updates = [self.accum_vars[v].assign_add(g) for (g, v) in grads]
        updates.append(self.total_loss.assign_add(loss))
        updates.append(self.count_loss.assign_add(1.0))
        with tf.control_dependencies(updates):
            return tf.no_op()

    def apply_gradients(self):
        grads = [(g, v) for (v, g) in self.accum_vars.items()]
        with tf.control_dependencies([self.opt.apply_gradients(grads)]):
            return self.total_loss / self.count_loss

================================================ FILE: Chapter08/gpt-2-train_files/encode.py ================================================

#!/usr/bin/env python3
# Usage:
#  PYTHONPATH=src ./encode.py <file|directory|glob> /path/to/output.npz
#  PYTHONPATH=src ./train --dataset /path/to/output.npz
import argparse
import numpy as np

import encoder
from load_dataset import load_dataset

parser = argparse.ArgumentParser(
    description='Pre-encode text files into tokenized training set.',
    formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name')
parser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate files with <|endoftext|> separator into chunks of this minimum size')
parser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.')
parser.add_argument('in_text', metavar='PATH', type=str, help='Input file, directory, or glob pattern (utf-8 text).')
parser.add_argument('out_npz', metavar='OUT.npz', type=str, help='Output file path')


def main():
    models_dir = '/content/gpt-2/src/models'
    args = parser.parse_args()
    enc = encoder.get_encoder(args.model_name, models_dir)
    print('Reading files')
    chunks = load_dataset(enc, args.in_text, args.combine, encoding=args.encoding)
    print('Writing', args.out_npz)
    np.savez_compressed(args.out_npz, *chunks)


if __name__ == '__main__':
    main()

================================================ FILE: Chapter08/gpt-2-train_files/load_dataset.py ================================================
import glob
import os

import numpy as np
import tensorflow as tf
import tqdm


def load_dataset(enc, path, combine, encoding=None):
    paths = []
    if os.path.isfile(path):
        # Simple file
        paths.append(path)
    elif os.path.isdir(path):
        # Directory
        for (dirpath, _, fnames) in os.walk(path):
            for fname in fnames:
                paths.append(os.path.join(dirpath, fname))
    else:
        # Assume glob
        paths = glob.glob(path)

    token_chunks = []
    raw_text = ''
    for path in tqdm.tqdm(paths):
        if path.endswith('.npz'):
            # Pre-encoded
            with np.load(path) as npz:
                for item in npz.files:
                    token_chunks.append(npz[item])
        else:
            # Plain text
            with open(path, 'r', encoding=encoding) as fp:
                raw_text += fp.read()
            if len(raw_text) >= combine:
                tokens = np.stack(enc.encode(raw_text))
                token_chunks.append(tokens)
                raw_text = ''
            else:
                raw_text += '<|endoftext|>'
    if raw_text:
        tokens = np.stack(enc.encode(raw_text))
        token_chunks.append(tokens)
    return token_chunks


def binary_search(f, lo, hi):
    if f(lo) or not f(hi):
        return None
    while hi > lo + 1:
        mid = (lo + hi) // 2
        if f(mid):
            hi = mid
        else:
            lo = mid
    return hi


class Sampler(object):
    """Fairly samples a slice from a set of variable sized chunks.

    'Fairly' means that the distribution is the same as sampling from one
    concatenated chunk, but without crossing chunk boundaries."""

    def __init__(self, chunks, seed=None):
        self.chunks = chunks
        self.total_size = sum(chunk.shape[0] for chunk in chunks)
        self.boundaries = [0]
        for i in range(len(chunks)):
            self.boundaries.append(self.boundaries[-1] + chunks[i].shape[0])
        self.rs = np.random.RandomState(seed=seed)

    def sample(self, length):
        assert length < self.total_size // len(self.chunks), \
            "Dataset files are too small to sample {} tokens at a time".format(length)
        while True:
            index = self.rs.randint(0, self.total_size - length - 1)
            i = binary_search(lambda j: self.boundaries[j] > index, 0,
                              len(self.boundaries) - 1) - 1
            if self.boundaries[i + 1] > index + length:
                within_chunk = index - self.boundaries[i]
                return self.chunks[i][within_chunk:within_chunk + length]

================================================ FILE: Chapter08/gpt-2-train_files/mdset.txt ================================================

This suggests that, as amoeboid cells are less contractile, while mesenchymal cells are more contractile, and there may be a switching between amoeboid and mesenchymal migration, perhaps there can also be a switching between the dominance of chemotaxis (amoeboid migration) and contact guidance (mesenchymal migration) [60]. One of the most interesting 2D platforms, allowing the study of contact guidance and chemotaxis, was proposed in [57], in which the authors demonstrated an additive effect of chemical gradients and fiber alignment by measuring the persistence time; they also observed that cells were directed by fiber alignment and there was no effect of the chemical gradient when fibers were aligned perpendicular to it. A similar setting was also used for studying the dependence of contact guidance on the cell cycle [48]. However, in the case of different multi-directional cues, totally different scenarios may happen, e.g.
in [51] it is shown that for contact guidance and electrotaxis in the cornea, electrotaxis wins when competing with the direction of alignment of the fibers.

Multi-cue kinetic model with non-local sensing for cell migration on a fibers network with chemotaxis
Martina Conte ∗ Nadia Loy †‡
June 18, 2020

Abstract

Cells perform directed motion in response to external stimuli that they detect by sensing the environment with their membrane protrusions. In particular, several biochemical and biophysical cues give rise to tactic migration in the direction of their specific targets. This defines a multi-cue environment in which cells have to sort and combine different, and potentially competitive, stimuli. We propose a non-local kinetic model for cell migration in the presence of two external factors both influencing cell polarization: contact guidance and chemotaxis. We propose two different sensing strategies and we analyze the two resulting models by recovering the appropriate macroscopic limit in different regimes, in order to see how the size of the cell, with respect to the variation of both external fields, influences the overall behavior. Moreover, we integrate numerically the kinetic transport equation in a two-dimensional setting in order to investigate qualitatively various scenarios.

Keywords. Kinetic equations, multiscale modeling, multi-cue, non-local, hydrodynamic limit, cell migration, contact guidance, chemotaxis

AMS subject classifications. 35Q20, 35Q92, 92B05, 45K05, 92C17

1 Introduction

Cell migration is a fundamental mechanism in a huge variety of processes, such as embryogenesis, wound healing, angiogenesis, immune response and tumor stroma formation and metastasis. During such processes, cells sense the environment and respond to external factors that induce a certain direction of motion towards specific targets (taxis): this results in a persistent migration in a certain preferential direction.
The guidance cues leading to directed migration may be biochemical or biophysical. Biochemical cues can be, for example, soluble factors or growth factors that give rise to chemotaxis, which involves a mono-directional stimulus. Other cues generating mono-directional stimuli include, for instance, bound ligands to the substratum that induce haptotaxis, durotaxis, that involves migration towards regions with an increasing stiffness of the ECM, electrotaxis, also known as galvanotaxis, that prescribes a directed motion guided by an electric field or current, or phototaxis, referring to the movement oriented by a stimulus of light [34]. Important biophysical cues are some of the properties of the extracellular matrix (ECM), first among all the alignment of collagen fibers and its stiffness. In particular, the fiber alignment is shown to stimulate contact guidance [22, 21]. Contact guidance is a key mechanism in a number of in vivo situations in which cells tend to migrate crawling on the fibers, thus following the directions imposed by the network structure of the ECM. This is a bi-directional cue, as, if the fibers network is not polarized, there is no preferential sense of migration along them. For example, during wound healing fibroblasts migrate efficiently along collagen or fibronectin fibers in connective tissues; in cancer spread and metastasis formation, cancer cells migrate through the stromal tissue and are thus facilitated to reach blood and lymphatic vessels [55, 49, 50].
∗BCAM - Basque Center for Applied Mathematics, Alameda de Mazarredo, 14, 48009 Bilbao, Spain (mconte@bcamath.org)
†Department of Mathematical Sciences “G. L. Lagrange”, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino, Italy, and Department of Mathematics “G. Peano”, Via Carlo Alberto 10, 10123 Torino, Italy (nadia.loy@polito.it)
‡Corresponding author: nadia.loy@polito.it
arXiv:2006.09707v1 [q-bio.CB] 17 Jun 2020
In many processes there are several directional cues that may induce different simultaneous stimuli. While the cell response to each of them has been largely studied, from both an intracellular and a migrative point of view, cell responses to a multi-cue environment are much less understood. The fundamental issue is the way cells rank, integrate or hierarchize multiple cues, in particular when these give conflicting stimuli, because, for example, they are not co-aligned [51]. Some studies have shown that there may be competition or cooperation between different stimuli in the directional response of a cell in a multi-cue environment. Considering the angle between the relative orientation of the directional cues, in the mono-directional case they compete when this angle is π, whereas they collaborate when this angle is 0. Bi-directional cues, such as contact guidance, compete when the angle is π/2. Then, many intermediate scenarios may happen and guidance stimuli submit or prevail according to other factors, above all their average concentration and intensity, which relate to the steepness of the gradient for taxis processes and to the degree of alignment for contact guidance. In particular, regarding the external environment, the average value of the directional cue (fiber density, molecule concentration, etc.) and the steepness of the gradient, or the degree of fiber alignment, are fundamental parameters that can be quantified, while, for cell migration, the angle between the polarization direction and the preferential direction imposed by the guidance cue can be measured, as well as the displacement, the mean squared displacement and the persistence time [15]. However, in general, when cues are aligned, a simple additive mechanism is not what governs multi-cue migration [34], even if it is weighted by the average cue concentrations or intensities.
In the framework of kinetic models, in the present paper we will focus on how the environmental sensing of two different stimuli over a finite radius can influence the choice of the direction of motion of a cell. In particular, we combine chemotaxis, a mono-directional biochemical cue, with contact guidance, defining the new orientation of the cells as a result of the sensing of the two cues over a finite neighborhood, which gives a non-local character to the model. In particular, the combination of chemotaxis and contact guidance happens in vivo in a variety of situations, for example in wound healing and in breast cancer. In wound healing, fibers guide cells towards the provisional clot, whilst in breast cancer, cells follow the aligned fibers at the tumor-stroma interface for migrating out of the primary tumor. Chemotaxis accelerates and enhances these processes [34, 6, 49, 50]. Therefore, a deep understanding of multi-cue migrational responses is a key step for the comprehension of both physiologic and pathologic processes, but also for building engineered tissues, as their structure is realized for guiding cell migration in a focused way [34]. There are not many experimental studies concerning chemotaxis and contact guidance, as well as other combinations of directional guidance cues [34]. One of the main reasons is the difficulty in designing environments for controlling multiple directional cues, in particular soluble factors and aligned fibers and fibrous materials. For example, in one of the first works studying in vitro contact guidance of neutrophil leukocytes on fibrils of collagen [59], it is shown that migration is more efficient in the direction of alignment than in the perpendicular direction; in the presence of chemotaxis, obtained by adding a chemoattractant, they observe that these cues cooperate or compete depending on their relative orientation.
In particular, the chemotactic response is lower for cells trying to cross fibers in the perpendicular direction. In [6], it is shown that alignment along the fibers is greater in the presence of a co-aligned chemoattractant. In [38], the authors study how multiple uniformly distributed cues quantitatively regulate random cell migration. One of the latest works concerning the competition between chemotaxis and contact guidance shows that less contractile cells are dominated by chemotaxis, while contact guidance may dominate in more contractile cells [52]. This suggests that, as amoeboid cells are less contractile while mesenchymal cells are more contractile, and there may be a switching between amoeboid and mesenchymal migration, there could also be a switching between the dominance of chemotaxis (amoeboid migration) and of contact guidance (mesenchymal migration) [60]. One of the most interesting 2D platforms allowing the study of contact guidance and chemotaxis was proposed in [57], in which the authors demonstrated an additive effect of chemical gradients and fiber alignment by measuring the persistence time; they also observed that cells were directed by fiber alignment, with no effect of the chemical gradient, when fibers were aligned perpendicular to it. A similar setting was also used to study the dependence of contact guidance on the cell cycle [48]. However, in the case of different multi-directional cues, totally different scenarios may occur: for example, in [51] it is shown that, for contact guidance and electrotaxis in the cornea, electrotaxis wins when competing with the direction of alignment of the fibers. There is a huge variety of mathematical models concerning cell migration. They range from microscopic models (also called individual-based models), which describe migration at the cell level, up to macroscopic ones, which describe collective cell migration at the tissue level.
There are many examples of individual-based models regarding chemotaxis ([14, 23] and references therein) and migration on the ECM [11, 54, 53]. Concerning macroscopic models, first among all, the famous Keller-Segel model is a drift-diffusion model postulated at the macroscopic level [29]. Many efforts were made in order to overcome the defects of the Keller-Segel model, as well as to derive it from lower-scale models (see [30, 27, 40, 41] and references therein). Between microscopic and macroscopic models there are mesoscopic models, which operate at an intermediate representative scale, as they include microscopic dynamics and describe the statistical distribution of the individuals. They also allow, for instance in the case of kinetic theory, to recover the appropriate macroscopic regime, which inherits some details of the microscopic dynamics, thus giving more significance to some of the parameters [40]. Some examples are [12, 7, 17]. The two major models for contact guidance at the mesoscopic level were proposed in [24] and [16], both local models in the physical space. Concerning multiple cues, not many models exist. In [31], the authors propose a macroscopic drift-diffusion model, derived from a space-jump process, in which they include the response to multiple chemicals. A recent review of macroscopic PDEs including multiple taxis has been proposed in [32]. In [58], the authors propose one of the first models for both contact guidance and chemotaxis, derived from a microscopic description of the dynamics. In a recent work [1], the authors propose a microscopic stochastic model for studying contact guidance, and add chemotaxis in order to study migration at the tumor-stroma interface for classifying TACS (tumor-associated collagen signatures). In [8], a kinetic model for cell-cell interactions on a fiber network in the presence of a tactic cue is considered.
In [36, 37], the authors propose a non-local kinetic model with a double biasing cue: the first one affects the choice of the direction and the second one the speed, including, through the non-locality, the sensing of macroscopic quantities performed by the cell, which depends on the cell size, i.e., on its maximum protrusion length. As already stated, in this paper we want to include chemotaxis and contact guidance as directional cues guiding cell polarization. In particular, we analyze two possible sensing strategies that a cell could apply for exploring its neighborhood, and that determine the choice of the transition probability for the transport model. The cell can measure the guidance cues independently and then choose the new orientation using the collected information, possibly weighted in different ways. Otherwise, it can measure the two directional stimuli simultaneously, weighting them equally, which amounts to assuming a conditioning of one cue on the other. Therefore, the cell response is related to the choice of the sensing strategy, and the overall macroscopic effect of the two cues is affected as well. Moreover, we shall consider for the first time a non-local sensing of the fiber distribution defined at the mesoscopic level; this allows for many intermediate scenarios in the analysis of the collaborative or competitive effect of the cues. For a better understanding, we discuss how the choices made on the transition probability, together with the size of the sampling volume and the characteristics of the two cues, determine the macroscopic behavior. Specifically, in section 2 we present the mathematical framework, while in section 3 we introduce the two classes of models, describing the different strategies for the sensing of a double cue, along with the corresponding macroscopic limits in various regimes, depending on the cell size and on the variability of the external cues.
In section 4, some numerical simulations of the kinetic models will be presented in order to investigate qualitatively various scenarios in a two-dimensional setting.

2 Mathematical framework

2.1 The transport model

The cell population will be described at a mesoscopic level through the distribution density p = p(t, x, v, v̂) that, for every time t > 0 and position x ∈ Ω ⊆ R^d, gives the statistical distribution of the speeds v ∈ [0, U], where U is the maximal speed a cell can achieve, and of the polarization directions v̂ ∈ S^{d−1}, S^{d−1} being the unit sphere in R^d. The velocity vector is thus v = v v̂. A macroscopic description of the cell population can then be classically recovered through the moments of the distribution function p. In particular, we recover the cell number density

\rho(t,x) = \int_{\mathbb{S}^{d-1}} \int_0^U p(t,x,v,\hat{v}) \, dv \, d\hat{v}, \qquad (1)

the momentum

\rho(t,x)\,\mathbf{U}(t,x) = \int_{\mathbb{S}^{d-1}} \int_0^U v\hat{v}\; p(t,x,v,\hat{v}) \, dv \, d\hat{v}, \qquad (2)

the cell mean velocity

\mathbf{U}(t,x) = \frac{1}{\rho(t,x)} \int_{\mathbb{S}^{d-1}} \int_0^U v\hat{v}\; p(t,x,v,\hat{v}) \, dv \, d\hat{v}, \qquad (3)

and the energy tensor

\mathbb{D}(t,x) = \int_{\mathbb{S}^{d-1}} \int_0^U (v\hat{v} - \mathbf{U}) \otimes (v\hat{v} - \mathbf{U})\; p(t,x,v,\hat{v}) \, dv \, d\hat{v}. \qquad (4)

The mesoscopic model consists in the transport equation for the cell distribution

\frac{\partial p}{\partial t}(t,x,v,\hat{v}) + v\hat{v} \cdot \nabla p(t,x,v,\hat{v}) = \mathcal{J}[p](t,x,v,\hat{v}), \qquad (5)

where the operator ∇ denotes the spatial gradient, so that the term v v̂·∇p takes into account the free particle transport. The term J[p](t, x, v, v̂) is the turning operator, which describes the scattering of the microscopic velocity in direction and speed. This is related to the typical microscopic dynamics of the cell, that is the run and tumble [5, 2]. The run and tumble prescribes an alternation of runs over straight lines and re-orientations: the choice of the new direction may be random or it may be biased by the presence of external factors, which may attract or repel the cell as well as increase the time spent in a run.
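As a minimal numerical sketch of the moments (1)-(3) in d = 2, the integrals over S^1 × [0, U] can be approximated on a (speed, angle) grid; the distribution p below is an illustrative assumption, not one of the models of this paper.

```python
import numpy as np

# Hypothetical discretization of p(v, theta) in d = 2: v is the speed in
# [0, U_max], theta parametrizes the polarization direction on S^1.
U_max = 1.0
v = np.linspace(0.0, U_max, 50)                            # speed grid
theta = np.linspace(0.0, 2 * np.pi, 100, endpoint=False)   # direction grid
dv, dtheta = v[1] - v[0], theta[1] - theta[0]

V, TH = np.meshgrid(v, theta, indexing="ij")
# Example distribution: uniform in direction, linear in speed (unnormalized).
p = V.copy()

# Cell number density rho = ∫∫ p dv dv̂              (cf. Eq. (1))
rho = np.sum(p) * dv * dtheta
# Mean velocity U = (1/rho) ∫∫ v v̂ p dv dv̂          (cf. Eqs. (2)-(3))
Ux = np.sum(V * np.cos(TH) * p) * dv * dtheta / rho
Uy = np.sum(V * np.sin(TH) * p) * dv * dtheta / rho
print(rho, Ux, Uy)  # for a direction-independent p the mean velocity vanishes
```

For any p that does not depend on the direction, the mean velocity (3) is zero by symmetry, which the sums reproduce up to round-off.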
The run and tumble is classically modeled by a scattering of the microscopic velocity called a velocity jump process [56], characterized by a turning frequency µ and a transition probability T. The general form of the turning operator implementing a velocity jump process at the kinetic level is

\mathcal{J}[p](x,v,\hat{v}) = \mu(x) \int_{\mathbb{S}^{d-1}} \int_0^U \Big[ T(x,v,\hat{v}\,|\,v',\hat{v}')\, p(t,x,v',\hat{v}') - T(x,v',\hat{v}'\,|\,v,\hat{v})\, p(t,x,v,\hat{v}) \Big] \, dv' \, d\hat{v}', \qquad (6)

where we assumed that the turning frequency does not depend on the microscopic velocity. The transition probability T(x, v, v̂ | v′, v̂′), also called the turning kernel, is a conditional probability satisfying, for all x ∈ Ω,

\int_{\mathbb{S}^{d-1}} \int_0^U T(x,v,\hat{v}\,|\,v',\hat{v}') \, dv \, d\hat{v} = 1, \qquad \forall\, v' \in [0,U],\ \hat{v}' \in \mathbb{S}^{d-1}. \qquad (7)

Thanks to this property, the operator (6) reads

\mathcal{J}[p](t,x,v,\hat{v}) = \mu(x) \left( \int_{\mathbb{S}^{d-1}} \int_0^U T(x,v,\hat{v}\,|\,v',\hat{v}')\, p(t,x,v',\hat{v}') \, dv' \, d\hat{v}' - p(t,x,v,\hat{v}) \right).

For our purposes, we shall assume that the transition probability only depends on the post-tumbling velocity,

T(x,v,\hat{v}\,|\,v',\hat{v}') = T(x,v,\hat{v}), \qquad (8)

as classically done in the pioneering works concerning kinetic equations for velocity jump processes [56, 42, 24]. This assumption, along with the one on the turning frequency, is motivated by the fact that we shall consider directional cues which are sensed non-locally, so that the most relevant quantity is the measured preferential direction rather than the incoming velocity. Assumption (8) allows to write the turning operator as

\mathcal{J}[p](t,x,v,\hat{v}) = \mu(x) \big( \rho(t,x)\, T(x,v,\hat{v}) - p(t,x,v,\hat{v}) \big). \qquad (9)

The mean macroscopic velocity after a tumble is given by the average of T,

\mathbf{U}_T(x) = \int_{\mathbb{S}^{d-1}} \int_0^U v\hat{v}\; T(x,v,\hat{v}) \, dv \, d\hat{v}, \qquad (10)

and the diffusion tensor by the variance-covariance matrix

\mathbb{D}_T(x) = \int_{\mathbb{S}^{d-1}} \int_0^U T(x,v,\hat{v})\, (v\hat{v} - \mathbf{U}_T) \otimes (v\hat{v} - \mathbf{U}_T) \, dv \, d\hat{v}. \qquad (11)

Arguing as in [46, 4], we can prove a linear version of the classical H-Theorem for the linear Boltzmann equation (5)-(9) with p^0 = p(0, x, v, v̂) ∈ L^1(Ω × [0, U] × S^{d−1}). In particular, the Maxwellian M(x, v, v̂) = ρ_∞(x) T(x, v, v̂), which makes the turning operator vanish, is the locally asymptotically stable equilibrium of the system. As already remarked in [36], this implies that T is the local asymptotic equilibrium steady state of the system; therefore U_T and D_T are the mean velocity and diffusion tensor of the cell population at equilibrium.

2.2 Boundary conditions

Since we are going to consider two-dimensional bounded domains without loss of cells and with no cells coming in, we shall assume conservation of mass. Therefore, we will require the no-flux boundary condition [47]

\int_{\mathbb{S}^{d-1}} \int_0^U p(t,x,v,\hat{v})\, \hat{v} \cdot \mathbf{n}(x) \, dv \, d\hat{v} = 0, \qquad \forall\, x \in \partial\Omega,\ t > 0, \qquad (12)

n(x) being the outward normal to the boundary ∂Ω at the point x. This class of boundary conditions is part of the wider class of non-absorbing boundary conditions. Denoting the boundary operator by R[p](t, x, v, v̂) = p(t, x, v′, v̂′), there are two important classes of kinetic boundary conditions which satisfy (12): the regular reflection boundary operators and the non-local (in velocity) boundary operators of diffusive type. We address the reader to the works [45] and [35] for the definition of these boundary operators. In the present work, we shall consider specular reflection boundary conditions,

p(t,x,v',\hat{v}') = p\left( t, x, v, \frac{\hat{v} - 2(\hat{v} \cdot \mathbf{n})\mathbf{n}}{|\hat{v} - 2(\hat{v} \cdot \mathbf{n})\mathbf{n}|} \right), \qquad \mathbf{n} \cdot \hat{v} \le 0, \qquad (13)

meaning that cells hitting the wall are reflected specularly, with the angle of reflection equal to the angle of incidence.

2.3 Macroscopic limits

In order to investigate the overall trend of the system, the macroscopic behavior is typically analyzed. By integrating Eq. (5) with (9) on S^{d−1} × [0, U], thanks to Eq.
(7), we obtain

\partial_t \rho(t,x) + \nabla \cdot \big( \rho(t,x)\, \mathbf{U}(t,x) \big) = 0,

i.e., the mass is conserved pointwise and, after integration on Ω, in the entire domain, because of the no-flux boundary conditions. If we multiply Eq. (5) with (9) by v v̂ and then integrate the result on S^{d−1} × [0, U], we see that the momentum is not conserved:

\partial_t \big( \rho(t,x)\, \mathbf{U}(t,x) \big) + \nabla \cdot \big( \rho(t,x)\, \mathbb{D}_T(x) \big) = \mu(x) \big( \rho(t,x)\, \mathbf{U}_T(x) - \rho(t,x)\, \mathbf{U}(t,x) \big).

We can observe that, if we multiply the transport equation by increasing powers n of v and then integrate on the velocity space, we obtain a non-closed system of macroscopic equations, since the equation describing the evolution of the n-th moment of p contains the (n+1)-th moment. Therefore, we need a procedure to obtain a closed evolution equation (or system of equations) for the macroscopic quantities. In particular, we are interested in the evolution of ρ(t, x) in the emerging regime of the system. Therefore, we shall consider a diffusive or a hydrodynamic scaling of the transport equation (5) with (9), resulting from a proper non-dimensionalization of the system. Diffusive and hydrodynamic limits for transport equations with velocity jump processes have been widely treated in [26, 40, 24, 36]. Formally, we introduce a small parameter ε ≪ 1 and re-scale the spatial variable as

\xi = \varepsilon x, \qquad (14)

ξ being the macroscopic spatial variable. According to the other characteristic quantities of the system under study, the macroscopic time scale τ will be

\tau = \varepsilon^2 t, \qquad (15)

that is the parabolic scaling representing a diffusion-dominated phenomenon, or

\tau = \varepsilon t, \qquad (16)

that is the hyperbolic scaling representing a drift-driven phenomenon. Under the spatial scaling (14), the transition probability may be expanded as

T(\xi,v,\hat{v}) = T_0(\xi,v,\hat{v}) + \varepsilon\, T_1(\xi,v,\hat{v}) + O(\varepsilon^2).
Therefore, the corresponding means and diffusion tensors will be given by

\mathbf{U}^i_T(\xi) = \int_{\mathbb{S}^{d-1}} \int_0^U T_i(\xi,v,\hat{v})\, v\hat{v} \, dv \, d\hat{v} \qquad (17)

and

\mathbb{D}^i_T(\xi) = \int_{\mathbb{S}^{d-1}} \int_0^U T_i(\xi,v,\hat{v})\, (v\hat{v} - \mathbf{U}^i_T) \otimes (v\hat{v} - \mathbf{U}^i_T) \, dv \, d\hat{v}. \qquad (18)

Considering a Hilbert expansion of the distribution function p,

p = p_0 + \varepsilon p_1 + O(\varepsilon^2), \qquad (19)

if there is conservation of mass, we have that all the mass is in p_0 [26], i.e.,

\rho_0 = \rho, \qquad \rho_i = 0 \quad \forall\, i \ge 1, \qquad (20)

where \rho_i = \int_{\mathbb{S}^{d-1}} \int_0^U p_i \, dv \, d\hat{v}. Furthermore, for performing the diffusive limit we shall assume that

\int_{\mathbb{S}^{d-1}} \int_0^U p_i\, v\hat{v} \, dv \, d\hat{v} = 0 \quad \forall\, i \ge 2

[26]. The functional solvability condition that is necessary for performing a diffusive limit (i.e., for choosing τ = ε²t) is

\mathbf{U}^0_T = 0, \qquad (21)

meaning that the leading order of the drift vanishes, which is coherent with the fact that the time scale τ = ε²t is chosen because the phenomenon is macroscopically diffusion-driven. The diffusive limit procedure prescribes to re-scale (5)-(9) with (14)-(15) and to insert (19) into the re-scaled equation. By comparing equal orders of ε, we obtain the macroscopic diffusive limit, given by (dropping the dependencies)

\frac{\partial \rho}{\partial \tau} + \nabla \cdot \big( \mathbf{U}^1_T\, \rho \big) = \nabla \cdot \left( \frac{1}{\mu}\, \nabla \cdot \big( \mathbb{D}^0_T\, \rho \big) \right), \qquad (22)

where

\mathbb{D}^0_T(\xi) = \int_{\mathbb{S}^{d-1}} \int_0^U T_0(\xi,v,\hat{v})\, v\hat{v} \otimes v\hat{v} \, dv \, d\hat{v}

is the diffusion motility tensor. Equation (22) is a diffusion-advection equation, where U¹_T is the first-order drift velocity. If (21) does not hold, a hyperbolic scaling is required, which gives

\frac{\partial \rho}{\partial \tau} + \nabla \cdot \big( \rho\, \mathbf{U}^0_T \big) = 0. \qquad (23)

This is an advection equation modeling a drift-driven phenomenon. We address the reader to [36] for further details. Concerning the boundary conditions, at the macroscopic level (12) gives [47]

\big( \mathbb{D}_T \nabla \rho - \rho\, \mathbf{U}^1_T \big) \cdot \mathbf{n} = 0 \quad \text{on } \partial\Omega

for the diffusive limit, whilst for the hyperbolic limit the corresponding boundary condition is

\mathbf{U}^0_T \cdot \mathbf{n} = 0 \quad \text{on } \partial\Omega.
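Before moving on, the specular reflection rule of section 2.2, Eq. (13), can be sketched numerically: the post-collision direction is v̂ − 2(v̂·n)n, so the component normal to the wall flips while the tangential one is kept. The wall normal and the incoming direction below are illustrative assumptions.

```python
import numpy as np

def reflect(vhat, n):
    """Specular reflection of a unit direction vhat at a wall with outward
    normal n (cf. Eq. (13)): the normal component of vhat is flipped."""
    vhat = np.asarray(vhat, dtype=float)
    n = np.asarray(n, dtype=float) / np.linalg.norm(n)
    r = vhat - 2.0 * np.dot(vhat, n) * n
    return r / np.linalg.norm(r)

# Incoming direction 30 degrees below the horizontal, hitting a bottom wall.
vhat = np.array([np.cos(np.pi / 6), -np.sin(np.pi / 6)])
n = np.array([0.0, -1.0])           # outward normal of the bottom wall
r = reflect(vhat, n)
print(r)  # tangential component kept, normal component flipped
```

Since |v̂| = 1 and the map is an orthogonal reflection, the outgoing direction is again a unit vector, consistent with the fact that (13) only changes the polarization direction, not the speed.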
3 A mathematical model for chemotaxis on a fibers network

In this section, we shall introduce the transition probability modeling the decision process of a cell in presence of a double directional guidance cue: a fibrous ECM and a chemoattractant. In particular, we shall consider amoeboid cells [60] moving by contact guidance without proteolysis: cells hit a fiber and then move along the direction of the fiber itself. It has been shown experimentally, for example in the case of glioma cancer cells [28], that randomly disposed fibers imply isotropic diffusion of cells, while aligned fibers cause anisotropic diffusion of cells along the preferential direction of the fibers themselves. The first transport model for contact guidance was proposed in [24], further studied and developed in [43, 8, 9] and applied to the study of glioma in [44, 20, 19, 13, 18]. The model proposed in [24] prescribes a distribution of fibers over the space of directions, given by the unit sphere in R^d,

q = q(x,\hat{v}), \qquad x \in \Omega,\ \hat{v} \in \mathbb{S}^{d-1}, \qquad (24)

that satisfies

Q1: q(x,\hat{v}) > 0, \quad \forall\, x \in \Omega,\ \hat{v} \in \mathbb{S}^{d-1};

Q2: \int_{\mathbb{S}^{d-1}} q(x,\hat{v}) \, d\hat{v} = 1, \quad \forall\, x \in \Omega;

Q3: q(x,\hat{v}) = q(x,-\hat{v}), \quad \forall\, x \in \Omega,\ \hat{v} \in \mathbb{S}^{d-1},

where the last condition means that we are considering a non-polarized network of fibers, so that cells are able to travel in both senses along every direction. Since q(x, ·) is a probability density, we can define the mean direction of the fibers

\mathbf{E}_q(x) = \int_{\mathbb{S}^{d-1}} q(x,\hat{v})\, \hat{v} \, d\hat{v}, \qquad (25)

and the diffusion tensor of the fibers, given by the variance-covariance matrix of q,

\mathbb{D}_q(x) = \int_{\mathbb{S}^{d-1}} q(x,\hat{v})\, (\hat{v} - \mathbf{E}_q) \otimes (\hat{v} - \mathbf{E}_q) \, d\hat{v}. \qquad (26)

As we consider a non-polarized fiber network, we have

\mathbf{E}_q(x) = 0, \qquad (27)

meaning that there is no mean direction in the dynamics. The tensor (26) is symmetric and positive definite when q is a regular probability distribution, and thus it is diagonalizable.
Each eigenvalue represents the diffusivity in the direction of the corresponding eigenvector: if the eigenvalues are equal, there is isotropic diffusion, while, if they differ, there is a preferential direction of motion, i.e., anisotropy. Therefore, the model introduced in [24], as shown in [43], allows to reproduce isotropic/anisotropic diffusion on a non-polarized fiber network. Concerning chemotaxis, we shall consider a chemoattractant in the region Ω, described by a strictly positive function

S = S(x): \Omega \longrightarrow \mathbb{R}_+. \qquad (28)

We consider the sensing performed by the cells to be non-local, as they may extend their protrusions, through which they sense the environment, up to several cell diameters [3]. The maximum length R of a protrusion is called the sensing radius; it was first introduced in [40] for modeling a non-local gradient of a chemical and then used in a number of works (see [10] for a review and references therein) for describing the sensing of macroscopic quantities. In particular, in [36] and, later, in [37], the authors propose a double-bias model, in which two cues are sensed non-locally and affect cell polarization and speed. In the present work we shall drop the sensing of the cue affecting the speed, which will be unbiased, and extend the model proposed in [36] to a double sensing of cues affecting the polarization of the cell. Therefore, in the model both S and q will be sensed non-locally by a cell that, starting from its position x, extends its protrusions in every direction v̂ ∈ S^{d−1} up to the distance R given by the sensing radius. In particular, assuming a non-local sensing of the fiber network allows to reproduce, with respect to a local sensing, a wider range of migration strategies that a cell can perform in order to cleverly reach the chemoattractant. Therefore, we shall consider the quantities

S(x + \lambda\hat{v}), \quad q(x + \lambda\hat{v}, \hat{v}), \qquad \forall\, x \in \Omega,\ \forall\, \hat{v} \in \mathbb{S}^{d-1},\ \lambda \le R.
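The quantities (25)-(26) and the eigenvalue discussion above can be checked numerically for d = 2. The bimodal von Mises distribution below is an illustrative choice of a non-polarized q satisfying Q1-Q3, not a distribution prescribed by the paper.

```python
import numpy as np

# Hypothetical fiber distribution on S^1: bimodal von Mises
# q(θ) ∝ exp(k cos 2(θ - θ0)), aligned along ±θ0, hence q(v̂) = q(-v̂) (Q3).
k, theta0 = 2.0, 0.0                       # fibers aligned along the x-axis
theta = np.linspace(0.0, 2 * np.pi, 720, endpoint=False)
dtheta = theta[1] - theta[0]
q = np.exp(k * np.cos(2.0 * (theta - theta0)))
q /= np.sum(q) * dtheta                    # normalization (Q2)

vhat = np.stack([np.cos(theta), np.sin(theta)], axis=1)
# Mean fiber direction (Eq. (25)) and diffusion tensor (Eq. (26), with E_q = 0):
Eq = np.sum(q[:, None] * vhat, axis=0) * dtheta
Dq = (q[:, None, None] * vhat[:, :, None] * vhat[:, None, :]).sum(0) * dtheta

print(Eq)           # ~ (0, 0): non-polarized network, Eq. (27)
print(np.diag(Dq))  # unequal diagonal entries: anisotropic diffusion along θ0
```

Here trace(D_q) = 1 by the normalization Q2, and the larger eigenvalue sits on the alignment direction θ0, matching the isotropic/anisotropic dichotomy recalled above.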
Of course, next to the border of the domain Ω, we shall always consider λ such that x + λv̂ ∈ Ω. In order to analyze qualitatively the impact of the non-locality at the macroscopic level, we study, as previously done in [36, 37], the impact of the directional cues S and q with respect to the size of the cell, which is related to its sensing radius R. Thus, we introduce the characteristic length of variation of S,

l_S := \left( \max_{x \in \Omega} \frac{|\nabla S|}{S} \right)^{-1}. \qquad (29)

It allows to approximate S(x + λv̂) with a positive quantity,

S(x + \lambda\hat{v}) \sim S(x) + \lambda \nabla S \cdot \hat{v} \ge 0 \quad \forall\, \lambda \le R, \ \text{ if } R < l_S, \qquad (30)

where we neglected higher-order terms in λ. Besides the above defined characteristic length of variation of the chemoattractant l_S, we define an analogous quantity for the fiber distribution. We choose

l_q := \left( \max_{x \in \Omega} \max_{\hat{v} \in \mathbb{S}^{d-1}} \frac{|\nabla q \cdot \hat{v}|}{q} \right)^{-1}. \qquad (31)

In this case, we can approximate q(x + λv̂, v̂) with a positive quantity,

q(x + \lambda\hat{v}, \hat{v}) \sim q(x,\hat{v}) + \lambda \nabla q \cdot \hat{v} \ge 0 \quad \forall\, \lambda < R, \ \text{ if } R < l_q. \qquad (32)

In particular, this definition of l_q takes into account the variation of the directionality of the fibers in space, which is what actually influences cell orientation, more than the spatial variation of the density of the extracellular matrix. We analyze the possible scenarios depending on the relation between R, l_S and l_q. In analogy with [36], let us now introduce the parameters

\eta_q := \frac{R}{l_q} \qquad (33)

and

\eta_S := \frac{R}{l_S}, \qquad (34)

which quantify the measuring capability of the cell with respect to the characteristic lengths of variation of the sensed guidance cues q and S. In particular, η_i < 1, i = q, S, means that the sensing radius is smaller than the characteristic length of variation of q (respectively S): the idea is that a single instantaneous sensing of the cell is not capable of catching the total spatial variability of q (respectively S). If instead η_i > 1, i = q, S, the sensing radius is large enough to capture the spatial variability of q (respectively S).
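A minimal numerical sketch of the definitions (29), (33)-(34): for a chemoattractant with a known exponential length scale, the computed l_S recovers that scale, and η_S then compares it with the sensing radius. The field S and the value of R below are illustrative assumptions.

```python
import numpy as np

# Hypothetical 1D chemoattractant S(x) = exp(x / l), whose characteristic
# length of variation (Eq. (29)) is exactly l, since |S'| / S = 1 / l.
x = np.linspace(0.0, 1.0, 1001)
l = 0.2
S = np.exp(x / l)

grad_S = np.gradient(S, x)                    # finite-difference gradient
l_S = 1.0 / np.max(np.abs(grad_S) / S)        # Eq. (29): recovers ~l

R = 0.05                                      # assumed sensing radius
eta_S = R / l_S                               # Eq. (34)
print(l_S, eta_S)  # eta_S < 1: one sensing cannot catch the variability of S
```

With R = 0.05 and l_S ≈ 0.2 we get η_S ≈ 0.25 < 1, i.e., the regime in which the cue S is expected to induce a diffusive, rather than drift-driven, macroscopic behavior.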
If we consider the two cues separately, in the first case we expect the sensing of q (respectively S) to induce a diffusive behavior, while in the second scenario the overall behavior induced by q (respectively S) is drift-driven. As we are considering the two guidance cues simultaneously affecting cell polarization, we now take into account four limit cases:

i) \eta_q, \eta_S \gg 1; \quad ii) \eta_q, \eta_S \ll 1; \quad iii) \eta_S \ll 1,\ \eta_q \gg 1; \quad iv) \eta_S \gg 1,\ \eta_q \ll 1.

In case i), a Taylor expansion cannot be used, since there is no guarantee that the first-order approximations are positive; the same holds in cases iii) and iv) for q and S, respectively. In order to quantify the relative contribution of chemotaxis with respect to contact guidance, we may introduce the parameter

\eta = \frac{\eta_q}{\eta_S}, \qquad (35)

which is larger than 1 if contact guidance prevails, whilst it is smaller than 1 if chemotaxis is stronger. Due to (33) and (34), despite its definition, η does not depend on the size and sensing capability of the cell, as

\eta = \frac{\eta_q}{\eta_S} = \frac{l_S}{l_q}.

In particular, if l_S is larger than l_q, i.e., η > 1, the gradient of q is steeper than that of S, thus enhancing a stronger effect of contact guidance on the dynamics. We may also observe that in case iii) we always have η > 1, while in case iv) we always have η < 1, i.e., contact guidance is weaker than chemotaxis. We shall propose two different transition probabilities describing two different sensing strategies: in the first model the sensings of q and S are independent, while in the second model a unique sensing is performed. In the first model, we introduce a transition probability that is the product of two different independent sensings:

T[q,S](x,v,\hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x+\lambda\hat{v}) \, d\lambda \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \; \psi(v). \qquad (36)

In this case the cell located in position x measures along the direction v̂ the field S(x+λv̂), weighted by γ_S, and, independently, the quantity q(x+λv̂, v̂), weighted by γ_q.
The sensing functions γ_S and γ_q have compact support in [0, R]; they may be Dirac deltas centered at R, if the cell only measures the guidance cues on its membrane (only at x + Rv̂ for every v̂), or Heaviside functions, if the cell measures and gives the same weight to q and S from x to x + Rv̂ in every direction. Formally, the transition probability may be seen as the product of the independent probabilities of q and S, i.e., T[q,S] = T̂[q] T̂[S]. The second model prescribes a simultaneous averaging of the guidance cues S and q, i.e.,

T[q,S](x,v,\hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x+\lambda\hat{v})\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \; \psi(v). \qquad (37)

This transition probability describes a cell in position x that measures in the direction v̂ the two quantities S(x+λv̂) and q(x+λv̂, v̂), weighting both with the sensing function γ. Formally, as the two sensings are not independent and, therefore, not factorized, we have a conditioning of S given q and vice versa, i.e., T[q,S] = T̃[S|q] T̃[q] = T̃[q|S] T̃[S]. In (36) and (37), c(x) is a normalization coefficient. Moreover, the probability density ψ is the distribution of the speeds on the interval [0, U] and satisfies

\int_0^U \psi(v) \, dv = 1.

We introduce its mean speed

\bar{U} = \int_0^U v\, \psi(v) \, dv \qquad (38)

and its second moment

D = \int_0^U v^2\, \psi(v) \, dv, \qquad (39)

such that the variance of ψ is given by \sigma^2_\psi = \frac{1}{2}(D - \bar{U}^2). We shall refer to the transport model (5)-(9) with (36) as the non-local independent sensing model, in which the cell averages the two cues independently according to two different sensing functions γ_q, γ_S. On the other hand, the transport model (5)-(9) with (37) is defined as the non-local dependent sensing model, describing cells that sense the two cues at the same time and average them with a unique sensing kernel γ. In the next sections we shall analyze the macroscopic limits of the two models in the scenarios i)-iv) and compare the two models.
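The difference between the two strategies can be sketched numerically by comparing the angular parts of (36) and (37) at a fixed position. The cues S and q below, the Heaviside sensing kernels, and the evaluation point are all illustrative assumptions; the point is only that the product of two λ-averages generally differs from the λ-average of the product.

```python
import numpy as np

# Angular part of the two transition probabilities at x0 = (0, 0), d = 2.
# Independent sensing (Eq. (36)): product of separate λ-averages of S and q.
# Dependent sensing (Eq. (37)): single λ-average of the product S*q.
R = 0.5
lam = np.linspace(0.0, R, 80); dlam = lam[1] - lam[0]   # Heaviside kernels on [0, R]
theta = np.linspace(0.0, 2 * np.pi, 360, endpoint=False)
dth = theta[1] - theta[0]

def S(x, y):                            # hypothetical chemoattractant, gradient along x
    return np.exp(2.0 * x)

def q(x, y, th):                        # hypothetical fibers along ±y, alignment grows with x
    k = 1.0 + 4.0 * np.clip(x, 0.0, None)
    return np.exp(k * np.cos(2.0 * (th - np.pi / 2))) / (2.0 * np.pi * np.i0(k))

X = np.cos(theta)[:, None] * lam[None, :]   # sample points x0 + λ v̂
Y = np.sin(theta)[:, None] * lam[None, :]
S_vals, q_vals = S(X, Y), q(X, Y, theta[:, None])

S_avg = S_vals.sum(axis=1) * dlam           # ∫ γ_S S dλ per direction
q_avg = q_vals.sum(axis=1) * dlam           # ∫ γ_q q dλ
SQ_avg = (S_vals * q_vals).sum(axis=1) * dlam

T_ind = S_avg * q_avg; T_ind /= T_ind.sum() * dth   # normalized angular densities
T_dep = SQ_avg / (SQ_avg.sum() * dth)
print(np.max(np.abs(T_ind - T_dep)))        # nonzero: the two strategies differ
```

Along directions where S and q co-vary with λ the average of the product deviates from the product of the averages, so the two normalized densities select different preferred directions, which is exactly the distinction the two models encode.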
3.1 Amoeboid motion and chemotaxis: non-local independent sensing

We first consider the non-local independent sensing case (5)-(9) with (36). We recall the expression of the transition probability,

T[q,S](x,v,\hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x+\lambda\hat{v}) \, d\lambda \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \; \psi(v).

The average of T, which will be the equilibrium velocity of the cell population, is given by

\mathbf{U}_T(x) = c(x)\, \bar{U} \int_{\mathbb{S}^{d-1}} \hat{v} \left( \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x+\lambda\hat{v}) \, d\lambda \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \right) d\hat{v}. \qquad (40)

Case i). In this case, we shall choose

\varepsilon = \min\left\{ \frac{1}{\eta_q}, \frac{1}{\eta_S} \right\}.

As a consequence of the fact that T cannot be expanded in powers of ε after re-scaling with (14), we have U⁰_T = U_T, given by (40). Therefore, we have to perform a hyperbolic scaling, which leads to the following macroscopic equation for the macroscopic cell density:

\frac{\partial \rho}{\partial \tau}(\tau,\xi) + \nabla \cdot \big( \rho(\tau,\xi)\, \mathbf{U}_T(\xi) \big) = 0, \qquad (41)

with U_T(ξ) given by the re-scaling of (40) with (14).

Case ii). In this case, we can expand both S(x+λv̂) and q(x+λv̂, v̂) and consider the approximations (30) and (32) for λ < min{l_q, l_S}. Therefore, we approximate the transition probability by substituting (30) and (32) into (36), obtaining the following approximation of the turning kernel:

T[q,S](x,v,\hat{v}) = c(x) \Big[ \Gamma^S_0 \Gamma^q_0\, S(x)\, q(x,\hat{v}) + \Gamma^S_0 \Gamma^q_1\, S(x)\, \nabla q \cdot \hat{v} + \Gamma^S_1 \Gamma^q_0\, q(x,\hat{v})\, \nabla S \cdot \hat{v} \Big] \psi(v), \qquad (42)

where we neglected higher-order terms in λ. In the latter,

c(x) = \frac{1}{S(x)\, \Gamma^S_0\, \Gamma^q_0}

and

\Gamma^S_i := \int_{\mathbb{R}_+} \lambda^i \gamma_S(\lambda) \, d\lambda, \qquad \Gamma^q_i := \int_{\mathbb{R}_+} \lambda^i \gamma_q(\lambda) \, d\lambda, \qquad i = 0, 1.

The quantities Γ^q_0, Γ^S_0 are the (γ_q-, γ_S-weighted) measures of the sensed linear tracts in every direction, whilst Γ^q_1, Γ^S_1 are the first moments of γ_q, γ_S on [0, R].
We can then introduce the small parameter

\varepsilon = \min\{\eta_q, \eta_S\} \qquad (43)

and re-scale the space variable as ξ = εx, getting

T_0[q,S](\xi,v,\hat{v}) = q(\xi,\hat{v})\, \psi(v), \qquad (44)

meaning that the equilibrium is determined by the fiber distribution, and

T_1[q,S](\xi,v,\hat{v}) = \left( \Gamma^q\, \nabla q \cdot \hat{v} + \Gamma^S\, q(\xi,\hat{v})\, \frac{\nabla S}{S(\xi)} \cdot \hat{v} \right) \psi(v),

where

\Gamma^S := \frac{\Gamma^S_1}{\Gamma^S_0}, \qquad \Gamma^q := \frac{\Gamma^q_1}{\Gamma^q_0}.

Because of (27) and (44), we have U⁰_T(ξ) = 0, meaning that we are in a diffusive regime, and the diffusive limit leads to the advection-diffusion equation (22). The explicit form of the zero-order macroscopic diffusion tensor is

\mathbb{D}^0_T(\xi) = D \int_{\mathbb{S}^{d-1}} q(\xi,\hat{v})\, \hat{v} \otimes \hat{v} \, d\hat{v} = D\, \mathbb{D}_q(\xi), \qquad (45)

and of the first-order macroscopic velocity

\mathbf{U}^1_T(\xi) = \bar{U} \int_{\mathbb{S}^{d-1}} \left( \Gamma^q\, \nabla q \cdot \hat{v} + \Gamma^S\, \frac{\nabla S}{S(\xi)} \cdot \hat{v}\; q(\xi,\hat{v}) \right) \hat{v} \, d\hat{v} = \bar{U} \left( \Gamma^q\, \nabla \cdot \mathbb{D}_q + \Gamma^S\, \mathbb{D}_q \frac{\nabla S}{S} \right). \qquad (46)

Therefore, the diffusion-advection equation (22) reads (dropping the dependencies)

\frac{\partial \rho}{\partial \tau} + \nabla \cdot \Big( \big( \chi^S\, \mathbb{D}_q \nabla S + \chi^q\, \nabla \cdot \mathbb{D}_q \big)\, \rho \Big) = \nabla \cdot \left( \frac{1}{\mu}\, \nabla \cdot \big( D\, \mathbb{D}_q\, \rho \big) \right), \qquad (47)

where

\chi^S(\xi) := \frac{\bar{U}\, \Gamma^S}{S(\xi)}, \qquad \chi^q := \bar{U}\, \Gamma^q \qquad (48)

are the sensitivities. The diffusion, represented by the motility tensor of the cells (45), only depends on the fiber distribution, while the advective term has two contributions, differently weighted by the sensitivities (48). We remark that, in this regime, we obtain the same macroscopic behavior postulated by Keller and Segel [29], with the logarithmic chemotactic sensitivity χ^S given in (48). The term D_q∇S depends on both the fiber distribution and the chemotactic field; it never vanishes if ∇S is not the null vector, since it may be proved that D_q is invertible. In the case of randomly disposed fibers, corresponding to the isotropic case, i.e., when D_q is proportional to the identity matrix, D_q∇S is parallel to ∇S, which thus represents the anisotropy direction.
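The chemotactic part of the drift (46) rests on the identity ∫_{S^{d−1}} (a·v̂) q(v̂) v̂ dv̂ = D_q a for spatially homogeneous, non-polarized q, with a standing for ∇S/S. A quick numerical check in d = 2 (with an illustrative bimodal fiber distribution and an arbitrary vector a):

```python
import numpy as np

# Check of the identity behind Eq. (46): for homogeneous fibers the
# first-order drift is the chemotactic gradient projected through D_q.
k = 1.5
theta = np.linspace(0.0, 2 * np.pi, 720, endpoint=False)
dth = theta[1] - theta[0]
q = np.exp(k * np.cos(2.0 * theta)); q /= q.sum() * dth   # fibers along ±x
vhat = np.stack([np.cos(theta), np.sin(theta)], axis=1)

# Fiber diffusion tensor D_q = ∫ q v̂ ⊗ v̂ dv̂ (E_q = 0 for this q):
Dq = (q[:, None, None] * vhat[:, :, None] * vhat[:, None, :]).sum(0) * dth
a = np.array([0.3, 1.0])                                  # stands for ∇S / S

# Left-hand side: ∫ q (a·v̂) v̂ dv̂, computed directly.
lhs = (q[:, None] * (vhat @ a)[:, None] * vhat).sum(0) * dth
print(lhs, Dq @ a)  # the two coincide
```

Since D_q here has its largest eigenvalue along the x-axis while a points mostly along y, D_q a tilts away from a: the drift follows the projection D_q∇S rather than ∇S itself, as discussed above.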
On the other hand, when D_q is anisotropic, if ∇S is not parallel to the eigenvector corresponding to the largest eigenvalue of D_q, then migration does not follow the dominant direction of the fibers, but rather its projection on ∇S. Moreover, the second contribution to the drift term, ∇·D_q, is a measure of the velocity field induced by the spatial variation of the distribution of fiber directions, which determines the microscopic velocities of the cells. This term vanishes if the fiber distribution is homogeneous in space. Therefore, if q is homogeneous in space, even in the case of competing cues, i.e., E_q ⊥ ∇S, the advective term U¹_T does not vanish in general, while in the case of cooperating cues, i.e., when ∇S is an eigenvector of D_q with eigenvalue D_{∇S}, migration is in the direction of ∇S with a kinetic factor χ^S D_{∇S}. In intermediate scenarios, migration happens along the projection D_q∇S; but, if q is not homogeneous, the dynamics is more complex and, even in the case of cooperation, we cannot conclude anything about additivity effects.

Case iii). In this case, we can Taylor-expand only the chemoattractant, as in (30), and the turning kernel (36) may be approximated as

T[q,S](x,v,\hat{v}) = c(x) \left[ S(x)\, \Gamma^S_0 \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda + \Gamma^S_1\, (\nabla S \cdot \hat{v}) \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \right] \psi(v), \qquad (49)

where we neglected higher-order terms in λ. Here, the normalization coefficient reduces to

c(x) = \frac{1}{\Gamma^S_0\, \Gamma^q_0\, S(x)}.

In this case we may choose

\varepsilon = \min\left\{ \frac{1}{\eta_q}, \eta_S \right\},

and, re-scaling the space variable as in (14), we get

T_0[q,S](\xi,v,\hat{v}) = \frac{1}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi+\lambda\hat{v},\hat{v}) \, d\lambda \; \psi(v) \qquad (50)

and

T_1[q,S](\xi,v,\hat{v}) = \frac{\Gamma^S}{\Gamma^q_0} \left( \frac{\nabla S}{S} \cdot \hat{v} \right) \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi+\lambda\hat{v},\hat{v}) \, d\lambda \; \psi(v).

Equation (50) indicates that the equilibrium distribution is a non-local average of the fiber distribution according to the sensing kernel γ_q, normalized by the measure Γ^q_0 of the sensed linear tract over the direction v̂.
Its average is

\mathbf{U}^0_T(\xi) = \frac{\bar{U}}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, \mathbf{E}_q(\xi+\lambda\hat{v}) \, d\lambda,

which vanishes since ξ + λv̂ ∈ Ω and (27) holds. Therefore, we perform the diffusive limit, which leads to (22) with

\mathbb{D}^0_T(\xi) = D \int_{\mathbb{S}^{d-1}} \frac{1}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi+\lambda\hat{v},\hat{v}) \, d\lambda \; \hat{v} \otimes \hat{v} \, d\hat{v}.

Let us now define

\mathbb{D}^\lambda_q(\xi) = \int_{\mathbb{S}^{d-1}} q(\xi+\lambda\hat{v},\hat{v})\, \hat{v} \otimes \hat{v} \, d\hat{v}, \qquad (51)

which, for each point ξ, is the diffusion tensor of the fibers on a circle of radius λ, and

\bar{\mathbb{D}}^0_q = \frac{1}{\Gamma^q_0} \int_{\mathbb{R}_+} \gamma_q(\lambda)\, \mathbb{D}^\lambda_q \, d\lambda, \qquad (52)

which is a weighted diffusion tensor of the fibers over the whole neighborhood sensed by the cells, so that

\mathbb{D}^0_T(\xi) = D\, \bar{\mathbb{D}}^0_q(\xi) \qquad (53)

and

\mathbf{U}^1_T(\xi) = \bar{U} c(\xi) \int_{\mathbb{S}^{d-1}} \Gamma^S_1\, (\nabla S \cdot \hat{v}) \left( \int_{\mathbb{R}_+} \gamma_q(\lambda)\, q(\xi+\lambda\hat{v},\hat{v}) \, d\lambda \right) \hat{v} \, d\hat{v} = \bar{U}\, \Gamma^S\, \bar{\mathbb{D}}^0_q(\xi)\, \frac{\nabla S}{S(\xi)} = \chi^S(\xi)\, \bar{\mathbb{D}}^0_q(\xi)\, \nabla S. \qquad (54)

We have defined the chemotactic sensitivity as

\chi^S(\xi) := \frac{\bar{U}\, \Gamma^S}{S(\xi)},

a function of the chemical alone, as S is the cue inducing a diffusive behavior. Here, the advection velocity is related to a non-local average of the diffusion tensor of the fibers, D̄⁰_q, projected on ∇S, and it cannot be decomposed into two contributions, because of the large size of the cell with respect to the spatial variability of the fiber distribution. Therefore, in this case the additive effect of the two cues is not evident, and many more scenarios are possible.

Remark. If we consider γ_q = δ(λ), we obtain a local sensing of the fibers. Without chemotaxis we would recover the classical model for contact guidance [24], which gives rise, at the macroscopic level, to a fully anisotropic diffusion equation. The presence of a non-local chemoattractant, even when R < l_S, gives rise to a drift correction term proportional to D_q∇S.

Case iv). The last case allows only for the Taylor expansion of the fiber distribution q, as in (32).
Therefore, the turning kernel may be approximated as

T[q,S](x,v,\hat{v}) = \left[ c_0(x)\, \Gamma^q_0\, q(x,\hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x+\lambda\hat{v}) \, d\lambda + c_1(x)\, \Gamma^q_1\, (\nabla q \cdot \hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x+\lambda\hat{v}) \, d\lambda \right] \psi(v), \qquad (55)

where

c_0(x)^{-1} := 2 \int_{\mathbb{S}^{d-1}} \Gamma^q_0\, q(x,\hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x+\lambda\hat{v}) \, d\lambda \, d\hat{v}

and

c_1(x)^{-1} := 2 \int_{\mathbb{S}^{d-1}} \Gamma^q_1\, (\nabla q \cdot \hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(x+\lambda\hat{v}) \, d\lambda \, d\hat{v},

both different from zero. In this case we may choose

\varepsilon = \min\left\{ \frac{1}{\eta_S}, \eta_q \right\}

and, re-scaling (55) with (14), we get T[q,S] = T₀[q,S]. Hence U⁰_T(ξ) does not vanish in Ω, as it is given by

\mathbf{U}^0_T(\xi) = \bar{U}\, \Gamma^q_0\, c_0(\xi) \int_{\mathbb{S}^{d-1}} \hat{v}\, q(\xi,\hat{v}) \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(\xi+\lambda\hat{v}) \, d\lambda \, d\hat{v} + \bar{U}\, \Gamma^q_1\, c_1(\xi) \int_{\mathbb{S}^{d-1}} \hat{v} \otimes \hat{v}\; \nabla q \int_{\mathbb{R}_+} \gamma_S(\lambda)\, S(\xi+\lambda\hat{v}) \, d\lambda \, d\hat{v}, \qquad (56)

and the macroscopic equation is given by (23). The mean velocity (56) is a linear combination of a non-local measure of the chemoattractant S over the fibers network and a non-local measure of S weighted by the directional average of the spatial variability of the fiber directions.

Remark. If we consider a local sensing for the chemoattractant, i.e., γ_S = δ(λ), we obtain a macroscopic advection-diffusion equation, in which the macroscopic velocity is induced by the spatial variation of the distribution of fiber directions, ∇·D_q, and the measure of S does not affect the choice of the direction. In this case, if ∇q vanishes, the model reduces to a fully anisotropic diffusion equation [24].

3.2 Amoeboid motion and chemotaxis: non-local dependent sensing

Concerning the non-local dependent sensing case (5)-(9) with (37), we recall the expression of the transition probability,

T[q,S](x,v,\hat{v}) = c(x) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x+\lambda\hat{v})\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \; \psi(v),

with

c(x)^{-1} := \int_{\mathbb{S}^{d-1}} \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x+\lambda\hat{v})\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \, d\hat{v}.

The macroscopic velocity is here given by

\mathbf{U}_T(x) = c(x)\, \bar{U} \int_{\mathbb{S}^{d-1}} \hat{v} \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x+\lambda\hat{v})\, q(x+\lambda\hat{v},\hat{v}) \, d\lambda \, d\hat{v}.
The macroscopic limits can be performed as in the previous section, and the choice of the parameter ε will be the same in the cases i)-iv), since it does not depend on the kind of model (independent or dependent sensing) but only on η_S and η_q.

Case i) In this case we cannot consider the expansions (32) and (30) and thus cannot expand the turning kernel, whose non-vanishing average is given by (57). Therefore, we perform a hyperbolic limit leading to (23) with macroscopic velocity (57).

Case ii) When, instead, the maximum sensing radius R is smaller than both characteristic lengths, we may consider the expansions (32) and (30) and substitute them in (37). Neglecting higher-order terms in λ, we get the approximation
\[
T[q,S](x,v,\hat v) = c(x) \big[ S(x)\, \Gamma_0\, q(x,\hat v) + S(x)\, \Gamma_1\, \nabla q \cdot \hat v + \Gamma_1\, q(x,\hat v)\, \nabla S \cdot \hat v \big] \psi(v) \tag{58}
\]
with
\[
c(x) = \frac{1}{S(x)\, \Gamma_0}, \qquad \Gamma_i := \int_0^R \lambda^i\, \gamma(\lambda)\, d\lambda , \quad i = 0, 1 .
\]
Re-scaling the space variable as in (14), we find
\[
T_0[q,S](\xi,v,\hat v) = q(\xi,\hat v)\, \psi(v)
\]
and
\[
T_1[q,S](\xi,v,\hat v) = \Gamma \Big[ \nabla q \cdot \hat v + q(\xi,\hat v)\, \frac{\nabla S}{S} \cdot \hat v \Big] \psi(v), \qquad \Gamma := \frac{\Gamma_1}{\Gamma_0} .
\]
Therefore U_T^0(ξ) = 0 because of (27), and we can perform a diffusive scaling that leads to the zero-order macroscopic diffusion tensor
\[
D_T^0(\xi) = D\, D_q(\xi), \tag{59}
\]
and to the first-order macroscopic velocity
\[
U_T^1(\xi) = \bar U\, \Gamma\, \nabla \cdot D_q(\xi) + \bar U\, \Gamma\, D_q(\xi)\, \frac{\nabla S}{S} . \tag{60}
\]
The macroscopic advection-diffusion equation (22) now reads (dropping the dependencies)
\[
\frac{\partial \rho}{\partial \tau} + \nabla \cdot \Big[ \chi \Big( \nabla \cdot D_q + D_q \frac{\nabla S}{S} \Big) \rho \Big] = \nabla \cdot \Big[ \frac{1}{\mu} \nabla \cdot \big( D\, D_q\, \rho \big) \Big] \tag{61}
\]
where χ := Ū Γ. Considerations similar to those for case ii) of the non-local independent sensing model apply, except that here there is a unique sensitivity χ that weights the two contributions to the advection term (60) equally.

Case iii) In this case we expand only the chemoattractant S(x + λv̂), as in (30), and the turning kernel (37) can be approximated as
\[
T[q,S](x,v,\hat v) = c(x) \Big[ S(x) \int_{\mathbb{R}_+} \gamma(\lambda)\, q(x + \lambda\hat v, \hat v)\, d\lambda + (\nabla S \cdot \hat v) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, q(x + \lambda\hat v, \hat v)\, d\lambda \Big] \psi(v) \tag{62}
\]
with
\[
c(x) := \frac{1}{\Gamma_0\, S(x)} .
\]
Re-scaling the space variable as in (14), we find
\[
T_0[q,S](\xi,v,\hat v) = \frac{1}{\Gamma_0} \int_{\mathbb{R}_+} \gamma(\lambda)\, q(\xi + \lambda\hat v, \hat v)\, d\lambda \;\psi(v)
\]
and
\[
T_1[q,S](\xi,v,\hat v) = \frac{1}{\Gamma_0} \Big( \frac{\nabla S}{S} \cdot \hat v \Big) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, q(\xi + \lambda\hat v, \hat v)\, d\lambda \;\psi(v).
\]
The zero-order macroscopic velocity is then
\[
U_T^0(\xi) = \frac{\bar U}{\Gamma_0} \int_{\mathbb{S}^{d-1}} \int_{\mathbb{R}_+} \gamma(\lambda)\, q(\xi + \lambda\hat v, \hat v)\, d\lambda \;\hat v\, d\hat v , \tag{63}
\]
and, again, it vanishes because ξ + λv̂ ∈ Ω and (27). Therefore, the macroscopic diffusion-advection equation is given by (22) with
\[
D_T^0(\xi) = \frac{D}{\Gamma_0} \int_{\mathbb{R}_+} D_q^{\lambda}(\xi)\, \gamma(\lambda)\, d\lambda = D\, \bar D_q^0 \tag{64}
\]
and
\[
U_T^1(\xi) = \frac{\bar U}{\Gamma_0} \int_{\mathbb{R}_+} \lambda\, D_q^{\lambda}(\xi)\, \gamma(\lambda)\, d\lambda \;\frac{\nabla S}{S}(\xi) = \bar U\, \bar D_q^1(\xi)\, \frac{\nabla S}{S}(\xi) , \tag{65}
\]
where we defined
\[
\bar D_q^1(\xi) = \frac{1}{\Gamma_0} \int_{\mathbb{R}_+} \lambda\, D_q^{\lambda}(\xi)\, \gamma(\lambda)\, d\lambda \tag{66}
\]
as an average of the weighted diffusion tensor of the fibers over the whole neighborhood sensed by the cell, differently from case iii) of the non-local independent model.

Case iv) In this case, again, we can only consider the approximation (32), and the transition probability rewrites as
\[
T[q,S](x,v,\hat v) = \Big[ c_0(x)\, q(x,\hat v) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat v)\, d\lambda + c_1(x)\, \nabla q \cdot \hat v \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, S(x + \lambda\hat v)\, d\lambda \Big] \psi(v) \tag{67}
\]
where
\[
c_0(x)^{-1} := 2 \int_{\mathbb{S}^{d-1}} q(x,\hat v) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(x + \lambda\hat v)\, d\lambda\, d\hat v
\]
and
\[
c_1(x)^{-1} := 2 \int_{\mathbb{S}^{d-1}} (\nabla q \cdot \hat v) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, S(x + \lambda\hat v)\, d\lambda\, d\hat v ,
\]
both different from zero. As before, by re-scaling (67) with (14), we get T[q,S] = T_0[q,S], and the average velocity U_T^0 = U_T ≠ 0. In particular, it is given by
\[
U_T(\xi) := \bar U\, c_0(\xi) \int_{\mathbb{S}^{d-1}} \hat v\, q(\xi,\hat v) \int_{\mathbb{R}_+} \gamma(\lambda)\, S(\xi + \lambda\hat v)\, d\lambda\, d\hat v + \bar U\, c_1(\xi) \int_{\mathbb{S}^{d-1}} \hat v \otimes \hat v\; \nabla q(\xi,\hat v) \int_{\mathbb{R}_+} \lambda\, \gamma(\lambda)\, S(\xi + \lambda\hat v)\, d\lambda\, d\hat v \tag{68}
\]
and, thus, we perform a hyperbolic limit leading to (23). The mean velocity (68) is a linear combination of a non-local measure of the chemoattractant S over the fiber network and a non-local average of S weighted by the directional average of the spatial variability of the fiber directions.
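The non-local tensors (51)-(52) can be evaluated by a simple quadrature on the unit circle. The sketch below is our own illustration (the function names, the grid resolution and the choice of a uniform fiber distribution are assumptions, not part of the paper's numerics): for an isotropic q = 1/(2π), the tensor D^λ_q reduces to the isotropic tensor 𝕀₂/2 for every radius λ, so the weighted tensor (52) is 𝕀₂/2 as well.

```python
import numpy as np

def D_lambda_q(q, xi, lam, n=2000):
    """Quadrature of (51) on S^1: D^lam_q(xi) = ∫ q(xi + lam*vhat, vhat) vhat⊗vhat dvhat."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    vhat = np.stack([np.cos(theta), np.sin(theta)], axis=1)        # unit directions on S^1
    w = q(xi[None, :] + lam * vhat, vhat) * (2.0 * np.pi / n)      # density times arc element
    return (w[:, None, None] * vhat[:, :, None] * vhat[:, None, :]).sum(axis=0)

# Uniform (isotropic) fiber distribution q = 1/(2*pi): for every lam the tensor is Id/2,
# hence the weighted tensor (52) is Id/2 too, i.e. purely isotropic diffusion.
uniform_q = lambda x, vhat: np.full(len(vhat), 1.0 / (2.0 * np.pi))
D = D_lambda_q(uniform_q, xi=np.array([0.0, 0.0]), lam=0.3)
print(D)   # ≈ 0.5 * identity
```

An aligned (non-uniform) q would instead produce off-diagonal terms and unequal eigenvalues, which is what drives the anisotropic diffusion in (53).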
Table 1: Summary of the models (dropping the local dependencies in ξ).

Case i) (drift dominated in both models):
- non-local independent sensing (5)-(9)-(36):
\[ U_T = c\, \bar U \int_{\mathbb{S}^{d-1}} \hat v \int_0^R \gamma_S(\lambda) S(\xi + \lambda\hat v)\, d\lambda \int_0^R \gamma_q(\lambda)\, q(\xi + \lambda\hat v, \hat v)\, d\lambda\, d\hat v \]
- non-local dependent sensing (5)-(9)-(37):
\[ U_T = c\, \bar U \int_{\mathbb{S}^{d-1}} \hat v \int_0^R \gamma(\lambda)\, S(\xi + \lambda\hat v)\, q(\xi + \lambda\hat v, \hat v)\, d\lambda\, d\hat v \]

Case ii) (drift-diffusion in both models):
- independent sensing: \( D_T^0 = D\, D_q \), \( U_T^1 = \bar U \big( \Gamma^q\, \nabla \cdot D_q + \Gamma^S\, D_q\, \nabla S / S \big) \)
- dependent sensing: \( D_T^0 = D\, D_q \), \( U_T^1 = \bar U\, \Gamma \big( \nabla \cdot D_q + D_q\, \nabla S / S \big) \)

Case iii) (drift-diffusion in both models):
- independent sensing: \( D_T^0 = D\, \bar D_q^0 \), \( U_T^1 = \bar U\, \Gamma^S\, \bar D_q^0\, \nabla S / S \)
- dependent sensing: \( D_T^0 = D\, \bar D_q^0 \), \( U_T^1 = \bar U\, \bar D_q^1\, \nabla S / S \)

Case iv) (drift dominated in both models):
- independent sensing:
\[ U_T = \bar U\, \Gamma_0^q\, c_0 \int_{\mathbb{S}^{d-1}} \hat v\, q \int_0^R \gamma_S(\lambda)\, S(\xi + \lambda\hat v)\, d\lambda\, d\hat v + \bar U\, \Gamma_1^q\, c_1 \int_{\mathbb{S}^{d-1}} \hat v \otimes \hat v\; \nabla q \int_0^R \gamma_S(\lambda)\, S(\xi + \lambda\hat v)\, d\lambda\, d\hat v \]
- dependent sensing:
\[ U_T = \bar U\, c_0 \int_{\mathbb{S}^{d-1}} \hat v\, q \int_0^R \gamma(\lambda)\, S(\xi + \lambda\hat v)\, d\lambda\, d\hat v + \bar U\, c_1 \int_{\mathbb{S}^{d-1}} \hat v \otimes \hat v\; \nabla q \int_0^R \lambda\, \gamma(\lambda)\, S(\xi + \lambda\hat v)\, d\lambda\, d\hat v \]

3.2.1 Comments

We can observe that, if γ_q = γ_S = γ = δ(λ − R), the two non-local transport models for independent and dependent sensing coincide, while, if the sensing kernels are not Dirac deltas (even if γ_q = γ_S = γ), the transport models are always different. At the macroscopic level, instead, the models coincide for any choice of the sensing functions only in case ii). In that case, in fact, the macroscopic limits differ only if γ_q ≠ γ_S, while in the cases iii) and iv) they differ whenever the sensing kernels are not Dirac deltas (even if γ_S = γ_q = γ). The relevant difference concerns the macroscopic transport velocities (see (54) and (65) for case iii), and (56) and (68) for case iv)). In fact, in the cases iii) and iv), for the non-local dependent sensing model, as only one cue is considered non-locally and both cues are averaged with the same sensing function γ, we have a weighted average over λ of the non-local quantities, which results in the weighted averages (65) and in the second term of (68). These remarks are summarized in Table 2.
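The difference between Dirac and Heaviside kernels can be made concrete through the moments Γ_i := ∫₀^R λ^i γ(λ) dλ of case ii). The sketch below is our own illustration (variable names and the quadrature are assumptions): for γ = H(R − λ) one gets Γ₀ = R and Γ₁ = R²/2, hence Γ = Γ₁/Γ₀ = R/2, while for γ = δ(λ − R) one gets Γ₀ = 1 and Γ₁ = R, hence Γ = R. The two kernels therefore weight the first-order drift differently even though both sense up to the same radius.

```python
import numpy as np

R = 0.5  # sensing radius, a value used in the tests of Section 4

# Heaviside kernel gamma(lam) = H(R - lam): midpoint quadrature of Gamma_i = ∫_0^R lam^i dlam
n = 100_000
lam = (np.arange(n) + 0.5) * (R / n)
dlam = R / n
gamma0 = np.sum(lam ** 0) * dlam   # = R
gamma1 = np.sum(lam ** 1) * dlam   # = R^2 / 2
print(gamma0, gamma1, gamma1 / gamma0)   # 0.5, 0.125, 0.25

# Dirac kernel gamma(lam) = delta(lam - R): Gamma_0 = 1, Gamma_1 = R, so
# Gamma = Gamma_1 / Gamma_0 = R, twice the Heaviside value R/2.
```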
Table 2: Summary of the comparison of the models for different choices of the sensing functions ("=" indicates the cases in which the models coincide, "≠" the ones in which they differ).

                                              | γq = γS = γ = δ | γq = γS = γ ≠ δ | γq ≠ γS
Meso models (5)-(9)-(36) and (5)-(9)-(37)     |        =        |        ≠        |    ≠
Macro models, case i)                         |        =        |        ≠        |    ≠
Macro models, case ii)                        |        =        |        =        |    ≠
Macro models, case iii)                       |        =        |        ≠        |    ≠
Macro models, case iv)                        |        =        |        ≠        |    ≠

4 Numerical simulations

We now propose two-dimensional numerical simulations to illustrate the behavior of the kinetic transport models for non-local independent sensing and non-local dependent sensing. In particular, we integrate the transport equation numerically as in [36] and then compute the macroscopic density (1). Concerning the fiber network, a classically used distribution is the von Mises distribution [39]
\[
\tilde q(x,\hat v) = \frac{1}{2\pi I_0(k(x))}\, e^{k(x)\, u(x) \cdot \hat v}
\]
where I_ν(k) is the modified Bessel function of the first kind of order ν and u(x) = (cos θ_q(x), sin θ_q(x)). It can be proved [25] that E_{q̃}(x) = u(x), and, therefore, θ_q(x) is the mean direction in the space [0, 2π) of the fibers located at point x. As we are dealing with cells migrating on a non-polarized network of fibers, we consider the symmetric version, namely the bimodal von Mises distribution
\[
q(x,\hat v) = \frac{1}{4\pi I_0(k(x))} \Big( e^{k(x)\, u(x) \cdot \hat v} + e^{-k(x)\, u(x) \cdot \hat v} \Big),
\]
which also satisfies Q3; its variance is [25]
\[
D_q(x) = \frac{1}{2} \Big( 1 - \frac{I_2(k)}{I_0(k)} \Big)\, \mathbb{I}_2 + \frac{I_2(k)}{I_0(k)}\, u \otimes u ,
\]
where 𝕀₂ is the identity tensor in R^{2×2}, while k and u are functions of x. Moreover, the variance in the space [0, 2π) is the scalar
\[
D_q(x) = \frac{1}{2} \int_0^{2\pi} q\, (\theta - \theta_q)^2\, d\theta = 1 - \frac{I_1(k)}{I_0(k)} ,
\]
which represents the degree of alignment of the fibers at point x.

4.1 Test 1: local ECM sensing and non-local chemotaxis

As a first example, we present the particular case in which the sensing of q is local.
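The variance tensor of the bimodal von Mises distribution given above can be cross-checked numerically. The sketch below is our own illustration (the direction θ_q, the value of k, and the integral evaluation of the Bessel functions are arbitrary choices): it compares the closed form ½(1 − I₂/I₀) 𝕀₂ + (I₂/I₀) u⊗u with the direct second moment ∫_{S¹} v̂⊗v̂ q(v̂) dv̂, and checks that its trace equals 1.

```python
import numpy as np

def bessel_i(n, k, m=20000):
    """I_n(k) via the integral form I_n(k) = (1/pi) ∫_0^pi exp(k cos t) cos(n t) dt."""
    t = (np.arange(m) + 0.5) * (np.pi / m)   # midpoint rule on (0, pi)
    return np.mean(np.exp(k * np.cos(t)) * np.cos(n * t))

def bimodal_vm(theta, k, theta_q):
    """Bimodal von Mises density on S^1 (mean direction theta_q, concentration k)."""
    c = np.cos(theta - theta_q)
    return (np.exp(k * c) + np.exp(-k * c)) / (4.0 * np.pi * bessel_i(0, k))

k, theta_q = 5.0, np.pi / 2
u = np.array([np.cos(theta_q), np.sin(theta_q)])

# Closed form: D_q = (1/2)(1 - I2/I0) Id + (I2/I0) u⊗u
r2 = bessel_i(2, k) / bessel_i(0, k)
D_closed = 0.5 * (1.0 - r2) * np.eye(2) + r2 * np.outer(u, u)

# Direct quadrature of the second moment: D_q = ∫_{S^1} vhat⊗vhat q(vhat) dvhat
theta = np.linspace(0.0, 2.0 * np.pi, 4000, endpoint=False)
vhat = np.stack([np.cos(theta), np.sin(theta)], axis=1)
w = bimodal_vm(theta, k, theta_q) * (2.0 * np.pi / len(theta))
D_quad = (w[:, None, None] * vhat[:, :, None] * vhat[:, None, :]).sum(axis=0)

print(np.abs(D_closed - D_quad).max())   # agreement up to quadrature accuracy
print(np.trace(D_closed))                # the trace is 1, since |vhat| = 1
```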
This illustrates the effect of a second directional cue on a cell population migrating by contact guidance and evaluating the local alignment of the fibers over a non-polarized network. Formally, we are dealing with (36) in which γ_q = δ(λ − 0). In particular, we consider a region
\[
\Omega_q = \{ x = (x, y) \in \Omega \ \text{s.t.}\ x_1 \le x \le x_2 \} \tag{69}
\]
with x_1 = 1.8 and x_2 = 3.2, in which the fibers are strongly aligned along the direction identified by θ_q = π/2. In particular, for (x, y) ∈ Ω_q, k(x, y) = 700, such that D_q = 5·10⁻³. In the rest of the domain, Ω − Ω_q, the fibers are uniformly distributed. The chemoattractant has a Gaussian profile
\[
S(x, y) = \frac{m_S}{\sqrt{2\pi\sigma_S^2}}\, e^{-\frac{|(x, y) - (x_S, y_S)|^2}{2\sigma_S^2}} . \tag{70}
\]
In particular, in Test 1 (see Fig. 1) we choose (x_S, y_S) = (4, 4), m_S = 10, σ_S² = 0.1. The initial condition for the cell population is a Gaussian
\[
\rho_0(x, y) = r_0\, e^{-\frac{|(x, y) - (x_0, y_0)|^2}{2\sigma_0^2}} \tag{71}
\]
with r_0 = 0.1 and σ_0² = 0.1. In this first test, the initial condition for the cell population is centered at (x_0, y_0) = (2.5, 2.5), i.e., the center of the region Ω_q (see Fig. 1a). Without the chemoattractant, because of the presence of highly aligned fibers, we would expect cells to diffuse anisotropically along the preferential fiber direction ±π/2, forming the well-known ellipse [43], which represents cells moving with the same probability along the directions π/2 and −π/2. In the present case, due to the presence of the chemoattractant, the symmetry is broken and, even though q describes a non-polarized fiber network, there is a preferential sense of motion (see Figs. 1d-1f). In particular, cells migrate along the fibers in the direction identified by θ_q = π/2, corresponding to the preferential sense imposed by the presence of the chemoattractant in the upper-right corner of the domain Ω.
Given this directional setting, the cell population dynamics is also greatly affected by the strength of the chemoattractant, which depends on m_S and σ_S², by the degree of alignment D_q, which depends on k(x, y), and by the sensing radius R. Another important aspect is the sensing function γ_S, which influences the transient dynamics and, especially, the relaxation time; the relaxation time appears to double when γ_S is a Heaviside function instead of a Dirac delta (see also [36]).

Figure 1 (Test 1): Evolution of the initial distribution given in (a) for the case of local q and non-local chemoattractant S with sensing function γ_S = δ(λ − R). S is a Gaussian centered at (4, 4) with m_S = 10 and σ_S² = 0.1; the sensing radius of the cells is set to R = 0.5. Panels: (a) initial cell distribution; (b) initial average polarization; (c) trajectory of the center of mass of the cell population, with a black dot plotted every Δt = 1; (d)-(f) evolution of the macroscopic density at t = 1.25, 3.75, 12.5; (g)-(i) polarization of the cells at the same times.

We also analyzed the average polarization of the cells at every position x, given by the moment (2). The microscopic directions of the cells are initially randomly distributed and they start from a vanishing initial speed (see Fig. 1b). Then, they start to align along the fibers and to migrate upward in the direction identified by the angle π/2, since the cells sense the chemoattractant (see Figs. 1g-1h). Eventually, when the cells reach the level y = 4, the microscopic directions polarize towards the chemoattractant (see Fig. 1i). The center of mass plotted in Fig. 1c stays in the region Ω_q during the migration of the cells along the fiber bundle, and it moves out of Ω_q only when it reaches y = 4.
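For the Gaussian profile (70), the quantity sensed by the chemotactic terms, ∇S/S, has the simple closed form −((x, y) − (x_S, y_S)) / σ_S². The following finite-difference check is a sketch we added (the sample point and step size are arbitrary), using the Test 1 parameters:

```python
import numpy as np

# Gaussian chemoattractant (70) with the Test 1 parameters.
def S(x, y, mS=10.0, sigma2=0.1, xS=4.0, yS=4.0):
    return mS / np.sqrt(2.0 * np.pi * sigma2) * np.exp(
        -((x - xS) ** 2 + (y - yS) ** 2) / (2.0 * sigma2))

# For a Gaussian, grad(S)/S = -((x, y) - (xS, yS)) / sigma2: the relative gradient
# driving the chemotactic drift grows linearly with the distance from the peak.
x, y, h = 3.2, 3.7, 1e-6
gx = (S(x + h, y) - S(x - h, y)) / (2.0 * h)   # centered finite differences
gy = (S(x, y + h) - S(x, y - h)) / (2.0 * h)
print(gx / S(x, y), gy / S(x, y))   # ≈ 8.0 and 3.0, i.e. -((x, y) - (4, 4)) / 0.1
```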
The black dots are plotted every Δt = 1, and it is clear that the highest acceleration occurs when the cells are on the bundle of fibers, while they slow down when they start to move out of the fiber stripe Ω_q.

4.2 Test 2: non-local ECM sensing and chemotaxis

As a second test, we present both the non-local independent sensing model and the non-local dependent sensing model. We now consider a non-local sensing of the distribution of fibers. In particular, we assume fibers distributed similarly to the previous test, i.e., highly aligned in Ω_q given, this time, by x_1 = 2.1 and x_2 = 2.9 (see Fig. 2b). Here, for (x, y) ∈ Ω_q, k(x, y) = 100, which corresponds to D_q = 0.0025, and θ_q(x, y) = π/2. In the region Ω − Ω_q the fibers are uniformly distributed. The initial condition of the cell population is (71) with (x_0, y_0) = (1, 0.5) (see Fig. 2a), while the chemoattractant is located as in Test 1, with m_S = 10 and σ_S² = 0.05. We compare the dynamics of the cells in four settings:

1. local fiber distribution and non-local chemoattractant, as in Test 1, i.e., (36) with γ_q = δ(λ − 0) and γ_S = δ(λ − R);
2. non-local sensing with a Dirac delta for both q and S; this corresponds to both (36) and (37) with γ_q = γ_S = γ = δ(λ − R);
3. non-local independent sensing with Heaviside sensing functions for both S and q, i.e., (36) with γ_q = γ_S = H(R − λ);
4. non-local dependent sensing for q and S, i.e., (37) with γ = H(R − λ).

Results of these simulations are shown in Fig. 2. We can observe that, in all four settings, the cells start from (1, 0.5), are attracted by the chemoattractant and, on their way towards S, cross the aligned-fiber region Ω_q and climb up this region in the direction π/2.
Eventually, in all the cases, the cells reach the chemoattractant, but the dynamics, as well as the transient time, is influenced by the different sensing kernels (even though the differences are not extremely pronounced) and by the local or non-local sensing strategy. Although settings 3 and 4 in Fig. 2, related to the case of independent and dependent cues, respectively, do not show very strong differences, in setting 3 (see Figs. 2k-2n) the tendency to move both in the direction π/2, determined by the fibers, and in the direction π/4, determined by the chemoattractant, appears more marked because of the independent sensing. In contrast, this behavior is least evident in the case in which the cells perform a local sensing of the fibers (setting 1), which also results in a general slowdown of the dynamics.

4.3 Test 3: non-local independent sensing model, comparison of the cases i)-iv)

We now present a comparison of the macroscopic behaviors of the cells depending on the relation between R, l_S and l_q, i.e., we compare the cases i), ii), iii) and iv). In particular, we do this for the non-local independent sensing model with γ_q = γ_S = H(R − λ), as this is the case in which the transport model differs from the dependent sensing model. Additionally, the independence of the two sensings allows us to visualize more efficiently the two distinct directional effects (contact guidance and chemotaxis).

Figure 2 (Test 2): Time evolution of the initial distribution given in Fig. 2a in the four settings 1-4, with panels (c)-(r) showing t = 1.25, 3.75, 5, 6.25 in each row; (a) initial condition for the cells; (b) initial fiber distribution. The sensing radius of the cells is R = 0.5 and the chemoattractant is (70) with m_S = 10, σ_S² = 0.05 and (x_S, y_S) = (4, 4). Setting 1 is represented in Figs. (c)-(f): local q and non-local chemoattractant, γ_S = δ(λ − R).
Setting 2 is represented in Figs. (g)-(j): non-local q and S with sensing functions γ_q = γ_S = δ(λ − R). Setting 3 is represented in Figs. (k)-(n): non-local q and S, independent sensing with γ_q = γ_S = H(R − λ). Setting 4 is represented in Figs. (o)-(r): non-local q and S, dependent sensing with γ = H(R − λ).

We consider the turning kernel describing contact guidance led by a q with mean direction θ_q(x, y) = 3π/4 for all (x, y) ∈ Ω and coefficient k(x, y), modulating the strength of the alignment, given by the Gaussian
\[
k(x, y) = m_k\, e^{-\frac{|(x, y) - (x_k, y_k)|^2}{2\sigma_k^2}} \tag{72}
\]
where (x_k, y_k) = (2.5, 2.5) and σ_k² = 0.15 (Fig. 3d). This mimics a situation in which the fibers are more aligned in the central circular region and uniformly disposed in the rest of the domain. We consider different values of m_k in order to obtain different values of l_q: m_k = 10 corresponds to l_q ≈ 0.031 and m_k = 100 corresponds to l_q ≈ 0.0031. Details about the estimation of l_q for a bimodal von Mises distribution of fibers q are given in Appendix A. The chemoattractant is (70) with (x_S, y_S) = (4.5, 4.5) and m_S = 10. In the simulations, we consider three different values for the variance of the chemoattractant σ_S² in order to obtain different values of l_S: σ_S² = 0.05, corresponding to l_S = 0.002 (Fig. 3a); σ_S² = 0.25, corresponding to l_S = 0.055 (Fig. 3b); and σ_S² = 1.8, corresponding to l_S = 0.25 (Fig. 3c). The initial distribution of cells for all the tests presented in Figs. 4, 5, 6, 7 and 8 is given by (71) with (x_0, y_0) = (1.5, 1.5), r_0 = 0.1, σ_0² = 0.1. In particular, we present five sets of simulations, summarized in Table 3.

Table 3: Summary of the simulations presented in Test 3.

 l_S   | l_q    | R    | Case | η    | Figure
 0.002 | 0.0031 | 0.7  | i)   | < 1  | 4
 0.25  | 0.0031 | 0.7  | i)   | ≫ 1  | 5
 0.055 | 0.031  | 0.02 | ii)  | > 1  | 6
 0.25  | 0.0031 | 0.02 | iii) | ≫ 1  | 7
 0.002 | 0.031  | 0.02 | iv)  | < 1  | 8

In Fig. 4, we consider the case in which η_S, η_q ≫ 1, i.e., we are dealing with case i).
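The Case column of Table 3 follows from comparing R with l_S and l_q. A minimal helper (our own encoding of the conditions stated for cases i)-iv); the function name is ours) reproduces the classification for the five rows of the table:

```python
def classify_case(R, lS, lq):
    """Regime of Section 3 from the sensing radius R and the characteristic
    lengths lS (chemoattractant) and lq (fibers)."""
    if R > lS and R > lq:
        return "i"    # drift dominated: neither cue can be Taylor-expanded
    if R < lS and R < lq:
        return "ii"   # drift-diffusion: both cues can be expanded
    if lq < R < lS:
        return "iii"  # drift-diffusion: only S can be expanded
    if lS < R < lq:
        return "iv"   # drift dominated: only q can be expanded
    raise ValueError("degenerate case: R equals a characteristic length")

# The five rows of Table 3: (lS, lq, R) -> expected case
rows = [(0.002, 0.0031, 0.7, "i"), (0.25, 0.0031, 0.7, "i"),
        (0.055, 0.031, 0.02, "ii"), (0.25, 0.0031, 0.02, "iii"),
        (0.002, 0.031, 0.02, "iv")]
for lS, lq, R, expected in rows:
    assert classify_case(R, lS, lq) == expected
print("all Table 3 rows classified consistently")
```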
The macroscopic behavior is strongly hyperbolic, with macroscopic velocity given by (40). In fact, in Fig. 4 we can observe that the behavior is not diffusive and the cluster of cells remains quite compact. Moreover, when the cells reach the region in which the fibers are strongly aligned in the direction 3π/4 (as shown in Fig. 3d), which is perpendicular to the favorable direction π/4 induced by the chemoattractant, they surround that region of strong alignment and move past it towards the chemoattractant. In this setting, the parameter defined in (35) is slightly smaller than 1 and, in fact, chemotaxis prevails in the overall dynamics, as the stationary state is clearly peaked on the chemoattractant profile, although the fiber structure influences the transient.

In Fig. 5, we consider S with σ_S² = 1.8 and, consequently, l_S = 0.25 (see Fig. 3c). Concerning the fibers, we have m_k = 100, so that l_q ≈ 0.0031, and the sensing radius is R = 0.7. This setting falls again in case i), but the behavior differs from the previous simulation in Fig. 4. The chemoattractant in Fig. 3c, in fact, is spread over the whole domain, and the quantity l_S is almost 10² times the l_S considered in Fig. 3a and used for the simulation in Fig. 4. Even though we are still in a strongly hyperbolic case and the cells are guided by the strong drift (40), since R is only slightly larger than l_S and l_S is large, the cell cluster diffuses a bit more in the domain. When it reaches the region of strongly aligned fibers, it starts to surround that region (see Figs. 5a-5c) but, as η_S = 2.8 = O(1), some cells that do not surround the region are slowed down and partially tend to align along the fibers. In Fig. 5d, for instance, we have a high density of cells both in the strongly aligned fiber region and in the region of high chemoattractant density. Eventually, the cells manage to overcome the area of highly aligned fibers and tend to converge to the chemoattractant profile (see Figs. 5e-5f).
Now the overall dynamics is greatly affected by the fibers and, in fact, η ≫ 1.

Figure 3 (Test 3): The three chemoattractant profiles used for comparing the cases i)-iv), given by (70) with m_S = 10 and (a) σ_S² = 0.05, corresponding to l_S = 0.002, (b) σ_S² = 0.25, corresponding to l_S = 0.055, and (c) σ_S² = 1.8, corresponding to l_S = 0.25. The fiber distribution is sketched in (d).

The second scenario, illustrated in Fig. 6, refers to case ii), since the sensing radius R = 0.02 is smaller than both l_S = 0.055 and l_q ≈ 0.031. At the macroscopic level, the behavior of the system is described by the diffusion-advection equation (47) with macroscopic velocity (46). Indeed, in Fig. 6 we can observe a highly diffusive behavior, as the macroscopic density of cells has invaded almost half of the domain before even starting to be influenced by the fibers. If we compare the same time step in Figs. 6b and 5b, we see that in both cases the cells are reaching the fibers and sensing the region in which the fibers are most aligned. However, in Fig. 5b the cell cluster is much more compact than in Fig. 6b, where the cells have already occupied half of the domain because of diffusion, and we have a high density of cells both close to the strongly aligned fiber region and around the initial position. Therefore, the cells start surrounding the central region of strongly aligned fibers, because they already sense the chemoattractant, and, once past this area, they tend to the chemoattractant profile (see Figs. 6c-6f). In particular, during the transient, the cells accumulate most at the sides of the region with highly aligned fibers. In this specific setting, η > 1 and, in fact, contact guidance strongly affects the dynamics. The third scenario, illustrated in Fig.
7, refers to case iii), since the sensing radius R = 0.02 is smaller than l_S = 0.25 but larger than l_q ≈ 0.0031. The macroscopic setting is described by a diffusion-advection equation with diffusion tensor and drift velocity given by (53) and (54), respectively. As η_S < 1, the chemoattractant induces a strong diffusivity but, since η_q > 1, the alignment of the fibers strongly affects the dynamics (see Figs. 7c-7d). Comparing, in addition, Figs. 6b and 7b, the highest cell concentration now lies along the mean fiber direction θ_q = 3π/4 in the region surrounding the center of the domain, where the fibers are aligned with a higher degree. As already observed in Section 3, this scenario prescribes η ≫ 1 and, in fact, contact guidance again dominates the dynamics.

Figure 4 (Test 3): Case i) with non-local q and S, sensed independently through the kernels γ_q = γ_S = H(R − λ); panels (a)-(f) show t = 1.25, 1.875, 2.5, 3.75, 5, 6.25. S is given in Fig. 3a with m_S = 10 and σ_S² = 0.05, so that l_S = 0.002. The fiber distribution q has a space-dependent parameter k given by (72) with m_k = 100, so that l_q ≈ 0.0031. The sensing radius of the cells is R = 0.7.

Figure 5 (Test 3): Case i) with non-local q and S, independent sensing with γ_q = γ_S = H(R − λ); panels (a)-(f) show t = 2.5, 5, 10, 15, 22.5, 27.5. S is given in Fig. 3c, corresponding to l_S = 0.25, while for the fiber distribution m_k = 100, so that l_q ≈ 0.0031. The sensing radius of the cells is R = 0.7.

Eventually, for a sensing radius R = 0.02 smaller than l_q ≈ 0.031 but larger than l_S = 0.002, the macroscopic behavior is approximated by a hyperbolic equation with drift velocity given in (56). Results of the simulation are presented in Fig. 8. Here, the chemoattractant has the profile shown in Fig. 3a. Cells diffuse in the domain because η_q is smaller than 1, and they start moving in a region with randomly disposed fibers (see Fig. 8a).
Then, they mainly follow the preferential direction π/4 thanks to the presence of the chemoattractant: it induces a strong drift because of the high non-locality, determining η_S ≫ 1. Here chemotaxis slightly dominates the dynamics and, in fact, η < 1.

Figure 6 (Test 3): Case ii) with non-local q and S, independent sensing with γ_q = γ_S = H(R − λ); panels (a)-(f) show t = 2.5, 5, 7.5, 10, 15, 20. S is given in Fig. 3b, corresponding to l_S = 0.055, while m_k = 10, so that l_q ≈ 0.031. The sensing radius of the cells is R = 0.02.

Figure 7 (Test 3): Case iii) with non-local q and S, independent sensing with γ_q = γ_S = H(R − λ); panels (a)-(f) show t = 2.5, 5, 10, 20, 30, 60. S is given in Fig. 3c, so that l_S = 0.25, while for the fiber distribution m_k = 100, corresponding to l_q ≈ 0.0031. The sensing radius of the cells is set to R = 0.02.

Figure 8 (Test 3): Case iv) with non-local q and S, independent sensing with γ_q = γ_S = H(R − λ); panels (a)-(f) show t = 1.25, 2.5, 5, 7.5, 10, 15. S is given in Fig. 3a, corresponding to l_S = 0.002, whilst m_k = 10, so that l_q ≈ 0.031. The sensing radius of the cells is R = 0.02.

4.4 Test 4: heterogeneous ECM environment

We now consider a domain Ω divided into several regions, each characterized by a different average direction of the fibers. In particular, we do this for the independent sensing model with γ_q = γ_S = H(R − λ), as in Test 3; the independence of the two sensings, in fact, allows us to visualize more efficiently the two distinct directional effects. In the first scenario, we consider the domain schematized in Fig. 9a; in each subdomain we have k(x, y) = 50, which corresponds to D_q = 0.005. The initial condition of the cells is represented in Fig. 9c, with initial density r_0 = 0.1, while the chemoattractant has a Gaussian profile (70) centered at (x_S, y_S) = (4, 4), with m_S = 10 and σ_S² = 0.5, as shown in Fig. 9b.
We observe that the cells do not migrate collectively towards the chemoattractant but divide into two main separated clusters (see Figs. 9f-9h): in fact, although the sensing radius R = 0.8 is quite large, the cells closer to the left boundary remain trapped in the first subdomain, showing a loss of adhesion with the rest of the cell population. As shown in Fig. 9i, even though the cells in the left subdomain align horizontally towards the chemoattractant, the high degree of alignment of the fibers does not allow them to escape this region, even at large times.

In the second scenario, we consider the domain represented in Fig. 10a; in each subdomain, the parameter k(x, y) = 50. The initial condition of the cell population is (71) with (x_0, y_0) = (4, 0.5) and r_0 = 0.1, while the chemoattractant has a Gaussian profile (70) centered at (x_S, y_S) = (2, 4.5) with m_S = 10 and σ_S² = 0.05, as shown in Figs. 10c and 10b, respectively. We observe that the cells do not migrate directly towards the chemoattractant, as they sense the heterogeneous fibrous environment and, consequently, adapt their migration to it. In particular, the cells that are able to reach and sense the isotropic subdomain where the fibers are uniformly distributed (defined by 1 ≤ x ≤ 3 and 0 ≤ y ≤ 3) move in the direction imposed by the gradient of the chemoattractant. On the other hand, in the subdomain 3 ≤ x ≤ 5 and 1 ≤ y ≤ 2, they follow the direction of fiber alignment, π/4, perpendicular to the favorable direction imposed by S. However, the sensing radius R = 0.7 allows the cells closer to the right boundary to escape rather quickly the disadvantageous (in terms of preferential direction) subdomains and, following first the direction π/2 in 2 ≤ y ≤ 3 and then 3π/4 in 3 ≤ y ≤ 4, to reach the chemoattractant.
Figure 9 (Test 4): Migration of cells in a heterogeneous domain, as illustrated in (a). The sensing radius of the cells is R = 0.8. The chemoattractant (b) is (70) with m_S = 10 and σ_S² = 0.5. The initial cell profile (c) evolves in time as illustrated in panels (d)-(i), at t = 0.04, 1.6, 2.904, 18.4, 44, 67.2.

5 Conclusion

We have proposed a kinetic model describing cell migration in a multi-cue environment. In particular, in the same spirit as [36], we have considered that cells, as they can extend protrusions up to several cell diameters, perform a non-local sensing of the environment up to a distance R (the sensing radius) from the nucleus. In the present model, there are two guidance cues affecting the polarization, and therefore the direction of motion, of the cells: contact guidance, which is a bi-directional cue, and a chemical gradient, which is a mono-directional cue. We remark that in this work, for the first time, a non-local sensing in physical space of the mesoscopic distribution of fibers is considered. In particular, we introduced two classes of models: in the first, the cells perform an independent sensing of the fibers and of the chemical in their neighborhood, while in the second, the cells average the chemical and the fibers with the same sensing kernel. In both cases, particular attention was devoted to the identification of the proper macroscopic limit according to the properties of the turning operator. We identified two parameters, η_q and η_S, that measure the relation between the cell sensing radius and the characteristic lengths of variation, l_S and l_q, of the two cues, and that discriminate between a diffusion-driven regime with an advective correction and a drift-driven regime. In particular, when the sensing radius does not exceed the characteristic length of the chemoattractant, the bi-directional nature of the
fibers allows for a diffusive regime; otherwise, the hyperbolic scaling leads to a macroscopic drift. A common feature of the different cases is the dependence of the macroscopic velocity on both the fiber network and the chemoattractant. This aspect highlights the non-trivial influence of contact guidance on the cell drift, although we considered a non-polarized fiber network. This interdependence is in accordance with the model proposed in [58]. Moreover, in the absence of a chemoattractant, this impact on the drift term could persist for spatially heterogeneous fiber distributions. This is in accordance with what is observed in [24], and it represents a step forward with respect to [58], in which the drift is a function of contact guidance only through the presence of a chemical gradient, i.e., without a chemoattractant there would be no drift.

Figure 10 (Test 4): Migration of cells in a heterogeneous domain, as illustrated in (a). The sensing radius of the cells is R = 0.7. The chemoattractant (b) is (70) with m_S = 10 and σ_S² = 0.05. The initial cell profile (c) evolves in time as illustrated in panels (d)-(i), at t = 0.5, 1, 1.5, 2.5, 3.5, 4.5.

The numerical simulations of the transport model pointed out the main features characterizing the two classes of models and the possible scenarios they are able to capture. We observed that the presence of two cues influencing cell polarization, even when the fibers are sensed locally, ensures a preferential sense of motion for cells lying on regions of highly aligned, non-oriented fibers. Test 3 showed the importance of deriving the macroscopic equations from an underlying microscopic dynamics and in the appropriate regime: a directly postulated drift-diffusion equation would not capture the exact dynamics in all the possible regimes.
The competitive or collaborative effect of the cues depends, in the first instance, on the angle between their relative orientations, i.e., between the direction of fiber alignment θ_q and the gradient of the chemoattractant. Moreover, especially in the case of competitive cues, which cue dominates depends on their relative strengths, in terms of both concentration and intensity (degree of alignment of the fibers k(x) or steepness of the chemotactic gradient). We introduced the parameter η = l_S / l_q which, independently of the cell size or its sensing capability, quantifies the relative contribution of contact guidance and chemotaxis, and provides a first separation between fiber-dominated and chemotaxis-dominated dynamics (η ≫ 1 or η ≪ 1, respectively). The presented framework also allows for the direct calculation of parameters that can be used to quantify directed cell migration and to assess its efficiency, such as the mean square displacement, persistence time, directional persistence and mean speed [42]. Additionally, the non-locality brings a further level of detail to the model, allowing different macroscopic behaviors to be obtained depending on the characteristics of the two sensings. In fact, we did not observe strong differences between the independent and the dependent sensing models when we assume in the former the same sensing kernel for fibers and chemoattractant, i.e., when γ_q = γ_S. However, if there are biological observations supporting the possibility that a cell might implement different strategies for sensing the underlying fiber network and the chemoattractant, the proposed model, in its independent sensing version, could be used to investigate this scenario and to compare the possible outcomes of this sensing approach with the case of a unique, common sensing strategy. Potentially, the case of competitive cues, combined with the non-local aspect of the model, could lead to interesting further analysis.
As observed in the last numerical tests, the combination of heterogeneous fiber landscapes with chemoattractive agents shows how the cell density can divide and cross the domain using different migration strategies. This leads to natural questions about the deeper mechanisms governing the competition between the two cues, considering, for instance, the possible role of cell adhesion in recovering collective migration. We remark that, even though the simulations were performed in a two-dimensional setting, the transport model (and, as a consequence, its macroscopic limits) is formulated in a general d-dimensional setting. Hence, a possible future development is to perform simulations in the three-dimensional case, which would be much more realistic for mimicking in-vivo migration of cells in the extracellular matrix. Moreover, the model that we proposed may be adapted to describe other directional cues, among others haptotactic, durotactic, or electrotactic mechanisms. Furthermore, in the same spirit as in [37], we could enrich this model with a non-constant sensing radius, as it may vary according to the spatial and directional variability of the external guidance cues. Lastly, this study was restricted to the case in which the cues affect only cell polarization, considering a uniform distribution of the speeds. However, similarly to what is done in [36, 37], it may be modified to model a multi-cue environment in which one of the signals also affects the speed of the cells.

A   Estimation of $l_q$

Let us consider the fiber density distribution $q(x, \hat{v})$ defined by a bimodal von Mises-Fisher distribution
\[
q(x, \hat{v}) = \frac{1}{4\pi I_0(k(x))} \left( e^{k(x)\, u \cdot \hat{v}} + e^{-k(x)\, u \cdot \hat{v}} \right),
\]
where $k(x) \in C^1(\Omega)$ and $I_\nu(k(x))$ denotes the modified Bessel function of the first kind of order $\nu$. We now want to give an estimation for the range of variability of the characteristic length $l_q$, defined as
\[
l_q := \left( \max_{x \in \Omega}\, \max_{\hat{v} \in \mathbb{S}^{d-1}} \frac{|\nabla q \cdot \hat{v}|}{q} \right)^{-1}.
\]
Since $\partial I_0/\partial k = I_1(k)$, we have that
\[
\nabla q = \frac{e^{k(x)\, u \cdot \hat{v}} - e^{-k(x)\, u \cdot \hat{v}}}{4\pi I_0(k(x))}\, \nabla k\, (u \cdot \hat{v}) - \frac{e^{k(x)\, u \cdot \hat{v}} + e^{-k(x)\, u \cdot \hat{v}}}{4\pi I_0^2(k(x))}\, \frac{\partial I_0}{\partial k}\, \nabla k
= \frac{e^{k(x)\, u \cdot \hat{v}} - e^{-k(x)\, u \cdot \hat{v}}}{4\pi I_0(k(x))}\, \nabla k\, (u \cdot \hat{v}) - \frac{e^{k(x)\, u \cdot \hat{v}} + e^{-k(x)\, u \cdot \hat{v}}}{4\pi I_0(k(x))}\, \frac{I_1(k(x))}{I_0(k(x))}\, \nabla k.
\]
Since $q(x, \hat{v}) > 0$, we have
\[
\frac{\nabla q \cdot \hat{v}}{q} = \left( \frac{e^{k(x)\, u \cdot \hat{v}} - e^{-k(x)\, u \cdot \hat{v}}}{e^{k(x)\, u \cdot \hat{v}} + e^{-k(x)\, u \cdot \hat{v}}}\, (u \cdot \hat{v}) - \frac{I_1(k(x))}{I_0(k(x))} \right) \|\nabla k\| \cos(\widehat{\nabla k, \hat{v}}),
\]
where $\|\cdot\|$ denotes the $L^2$-norm, $\widehat{\nabla k, \hat{v}}$ is the angle between $\nabla k$ and $\hat{v}$, and we use the fact that $\|\hat{v}\| = 1$. Therefore,
\[
\frac{|\nabla q \cdot \hat{v}|}{q} = \left| \frac{e^{k(x)\, u \cdot \hat{v}} - e^{-k(x)\, u \cdot \hat{v}}}{e^{k(x)\, u \cdot \hat{v}} + e^{-k(x)\, u \cdot \hat{v}}}\, (u \cdot \hat{v}) - \frac{I_1(k(x))}{I_0(k(x))} \right| \|\nabla k\|\, |\cos(\widehat{\nabla k, \hat{v}})|.
\]
Recalling that $|a - b| \le |a| + |b|$, that
\[
-1 \le \frac{e^{k(x)\, u \cdot \hat{v}} - e^{-k(x)\, u \cdot \hat{v}}}{e^{k(x)\, u \cdot \hat{v}} + e^{-k(x)\, u \cdot \hat{v}}} \le 1,
\]
and that $|\cos(\cdot)| \le 1$, we get
\[
\frac{|\nabla q \cdot \hat{v}|}{q} \le \left( 1 + \frac{I_1(k(x))}{I_0(k(x))} \right) \|\nabla k\|.
\]
Considering Eq. (1.12) in [33] for $\nu = 1$, we obtain that $I_1/I_0 < 1$ and, therefore,
\[
\frac{|\nabla q \cdot \hat{v}|}{q} < 2\|\nabla k\|,
\]
which implies
\[
\max_{x \in \Omega}\, \max_{\hat{v} \in \mathbb{S}^{d-1}} \frac{|\nabla q \cdot \hat{v}|}{q} < 2 \max_{x \in \Omega} \|\nabla k\|.
\]
This translates into
\[
l_q \ge \frac{1}{2 \max_{x \in \Omega} \|\nabla k\|}. \tag{73}
\]
In particular, if there exists $x$ such that $\cos(\widehat{\nabla k, \hat{v}}) = 1$ and, at the same time, $\nabla k(x) \parallel u$, then (73) is true with the equal sign. In particular, for the symmetry of (72) and (70) we shall consider
\[
l_q \approx \frac{1}{2 \max_{x \in \Omega} \|\nabla k\|}.
\]

Acknowledgments

The authors would like to thank Prof. Luigi Preziosi for fruitful discussions and valuable comments. This work was partially supported by Istituto Nazionale di Alta Matematica, Ministry of Education, Universities and Research, through the MIUR grant Dipartimento di Eccellenza 2018-2022, Project no. E11G18000350001, and the Scientific Research Programmes of Relevant National Interest project n. 2017KL4EF3. NL also acknowledges Compagnia di San Paolo. This research was also partially supported by the Basque Government through the BERC 2018-2021 program and by the Spanish State Research Agency through BCAM Severo Ochoa excellence accreditation SEV-2017-0718.
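As a numerical side check (not part of the paper), the key inequality of Appendix A, $I_1(k)/I_0(k) < 1$, can be verified by evaluating the modified Bessel functions of the first kind through their power series $I_\nu(k) = \sum_{m \ge 0} (k/2)^{2m+\nu} / (m!\,(m+\nu)!)$:

```python
import math

def bessel_i(nu, k, terms=60):
    # modified Bessel function of the first kind I_nu(k), via its power series
    return sum((k / 2.0) ** (2 * m + nu) / (math.factorial(m) * math.factorial(m + nu))
               for m in range(terms))

# the ratio I_1(k)/I_0(k) stays strictly below 1 and increases with k
ks = [0.1, 1.0, 5.0, 20.0]
ratios = [bessel_i(1, k) / bessel_i(0, k) for k in ks]
```

For small $k$ the ratio behaves like $k/2$ (so about 0.05 at $k = 0.1$), and it approaches 1 only as $k \to \infty$, consistent with the strict bound used to derive (73).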
MC has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 713673. The project that gave rise to these results received the support of a fellowship from la Caixa Foundation (ID 100010434). The fellowship code is LCF/BQ/IN17/116.

References

[1] Y. Azimzade, A. A. Saberi, and M. Sahimi. Regulation of migration of chemotactic tumor cells by the spatial distribution of collagen fiber orientation. Phys. Rev. E, 99:062414, 2019.
[2] H. C. Berg. Random Walks in Biology. Princeton University Press, revised edition, 1983.
[3] H. C. Berg and E. M. Purcell. Physics of chemoreception. Biophys. J., 20(2):193–219, 1977.
[4] M. Bisi, J. A. Carrillo, and B. Lods. Equilibrium solution to the inelastic Boltzmann equation driven by a particle bath. J. Stat. Phys., 133(5):841–870, 2008.
[5] S. M. Block, J. E. Segall, and H. C. Berg. Adaptation kinetics in bacterial chemotaxis. J. Bacteriol., 154(1):312–323, 1983.
[6] B. A. Bromberek, P. A. J. Enever, D. I. Shreiber, M. D. Caldwell, and R. T. Tranquillo. Macrophages influence a competition of contact guidance and chemotaxis for fibroblast alignment in a fibrin gel coculture assay. Exp. Cell Res., 275(2):230–242, 2002.
[7] F. A. C. C. Chalub, P. A. Markowich, B. Perthame, and C. Schmeiser. Kinetic models for chemotaxis and their drift-diffusion limits. Monatsh. Math., 142(1):123–141, 2004.
[8] A. Chauviere, T. Hillen, and L. Preziosi. Modeling cell movement in anisotropic and heterogeneous network tissues. Netw. Heterog. Media, 2(2):333–351, 2007.
[9] A. Chauviere, T. Hillen, and L. Preziosi. Modeling the motion of a cell population in the extracellular matrix. Discrete Cont. Dyn.-B, 2007(Supplemental volume):250–259, 2007.
[10] L. Chen, K. J. Painter, C. Surulescu, and A. Zhigun. Mathematical models for cell migration: a nonlocal perspective. arXiv preprint arXiv:1911.05200, 2019.
[11] A. Colombi, M. Scianna, and L. Preziosi.
Coherent modelling switch between pointwise and distributed representations of cell aggregates. J. Math. Biol., 74(4):783–808, 2017.
[12] A. Colombi, M. Scianna, and A. Tosin. Differentiated cell behavior: a multiscale approach using measure theory. J. Math. Biol., 71:1049–1079, 2015.
[13] M. Conte, L. Gerardo-Giorda, and M. Groppi. Glioma invasion and its interplay with nervous tissue and therapy: A multiscale model. J. Theor. Biol., 486:110088, 2020.
[14] E. Di Costanzo, M. Menci, E. Messina, R. Natalini, and A. Vecchio. A hybrid model of collective motion of discrete particles under alignment and continuum chemotaxis. Discrete Cont. Dyn.-B, 25:443–472, 2020.
[15] R. Dickinson and R. T. Tranquillo. Stochastic model of biased cell migration based on binding fluctuations of adhesion receptors. J. Math. Biol., 19:563–600, 1991.
[16] R. B. Dickinson. A generalized transport model for biased cell migration in an anisotropic environment. J. Math. Biol., 40(2):97–135, 2000.
[17] R. Eftimie. Hyperbolic and kinetic models for self-organized biological aggregations and movement: a brief review. J. Math. Biol., 65(1):35–75, 2012.
[18] C. Engwer, T. Hillen, M. Knappitsch, and C. Surulescu. Glioma follow white matter tracts: a multiscale DTI-based model. J. Math. Biol., 71(3):551–582, 2015.
[19] C. Engwer, M. Knappitsch, and C. Surulescu. A multiscale model for glioma spread including cell-tissue interactions and proliferation. Math. Biosci. Eng., 13:443–460, 2016.
[20] C. Engwer, C. Stinner, and C. Surulescu. On a structured multiscale model for acid-mediated tumor invasion: The effects of adhesion and proliferation. Math. Mod. Meth. Appl. S., 27:1355–1390, 2017.
[21] P. Friedl. Prespecification and plasticity: shifting mechanisms of cell migration. Curr. Opin. Cell Biol., 16:14–23, 2004.
[22] P. Friedl and E.-B. Bröcker. The biology of cell locomotion within three-dimensional extracellular matrix. Cell. Mol. Life Sci., 57:41–64, 2000.
[23] R. Giniūnaitė, R. E. Baker, P. M.
Kulesa, and P. K. Maini. Modelling collective cell migration: neural crest as a model paradigm. J. Math. Biol., 80:481–504, 2019.
[24] T. Hillen. M5 mesoscopic and macroscopic models for mesenchymal motion. J. Math. Biol., 53(4):585–616, 2006.
[25] T. Hillen, A. Murtha, K. J. Painter, and A. Swan. Moments of the von Mises and Fisher distributions and applications. Math. Biosci. Eng., 14(3):673–694, 2017.
[26] T. Hillen and H. G. Othmer. The diffusion limit of transport equations derived from velocity-jump processes. SIAM J. Appl. Math., 61:751–775, 2000.
[27] T. Hillen and K. J. Painter. A user's guide to PDE models for chemotaxis. J. Math. Biol., 58(1):183–217, 2008.
[28] J. Johnson, M. O. Nowicki, C. H. Lee, E. A. Chiocca, M. S. Viapiano, S. E. Lawler, and J. J. Lannutti. Quantitative analysis of complex glioma cell migration on electrospun polycaprolactone using time-lapse microscopy. Tissue Eng. Part C Methods, 15(4):531–540, 2009.
[29] E. F. Keller and L. A. Segel. Initiation of slime mold aggregation viewed as an instability. J. Theor. Biol., 26(3):399–415, 1970.
[30] K. J. Painter. Mathematical models for chemotaxis and their applications in self-organisation phenomena. J. Theor. Biol., 481:162–182, 2019.
[31] K. J. Painter, P. K. Maini, and H. G. Othmer. Development and applications of a model for cellular response to multiple chemotactic cues. J. Math. Biol., 41(4):285–314, 2000.
[32] N. Kolbe, N. Sfakianakis, C. Stinner, C. Surulescu, and J. Lenz. Modeling multiple taxis: tumor invasion with phenotypic heterogeneity, haptotaxis, and unilateral interspecies repellence. arXiv preprint arXiv:2005.01444, 2020.
[33] A. Laforgia and P. Natalini. Some inequalities for modified Bessel functions. J. Inequal. Appl., 2010(1):253035, 2010.
[34] L. Lara and I. Schneider. Directed cell migration in multi-cue environments. Integr. Biol., 5(11):1306–1323, 2013.
[35] B. Lods. Semigroup generation properties of streaming operators with noncontractive boundary conditions. Math. Comput.
Model., 42:1441–1462, 2005.
[36] N. Loy and L. Preziosi. Kinetic models with non-local sensing determining cell polarization and speed according to independent cues. J. Math. Biol., 80:373–421, 2019.
[37] N. Loy and L. Preziosi. Modelling physical limits of migration by a kinetic model with non-local sensing. J. Math. Biol., 2019. In press.
[38] G. Maheshwari, A. Wells, L. G. Griffith, and D. A. Lauffenburger. Biophysical integration of effects of epidermal growth factor and fibronectin on fibroblast migration. Biophys. J., 76(5):2814–2823, 1999.
[39] K. V. Mardia and P. E. Jupp. Directional statistics, volume 494. John Wiley & Sons, 2009.
[40] H. Othmer and T. Hillen. The diffusion limit of transport equations II: Chemotaxis equations. SIAM J. Appl. Math., 62:1222–1250, 2002.
[41] H. Othmer and A. Stevens. Aggregation, blowup, and collapse: The ABC's of taxis in reinforced random walks. SIAM J. Appl. Math., 57:1044–1081, 2001.
[42] H. G. Othmer, S. R. Dunbar, and W. Alt. Models of dispersal in biological systems. J. Math. Biol., 26(3):263–298, 1988.
[43] K. J. Painter. Modelling cell migration strategies in the extracellular matrix. J. Math. Biol., 58(4):511–543, 2008.
[44] K. J. Painter and T. Hillen. Transport and anisotropic diffusion models for movement in oriented habitats. Lecture Notes in Mathematics, vol. 2071, pages 177–222. Springer-Verlag, 2013.
[45] A. Palczewski. Velocity averaging for boundary value problems, pages 1–284. Ser. Adv. Math. Appl. Sci. World Scientific Publishing Company, 1992.
[46] R. Pettersson. On solutions to the linear Boltzmann equation for granular gases. Transport Theor. Stat., 33(5-7):527–543, 2004.
[47] R. G. Plaza. Derivation of a bacterial nutrient-taxis system with doubly degenerate cross-diffusion as the parabolic limit of a velocity-jump process. J. Math. Biol., 78(6):1681–1711, 2019.
[48] K. E. Pourfarhangi, E. Hoz, A. Cohen, and B. Gligorijevic. Contact guidance is cell cycle-dependent. APL Bioeng., 2:031904, 2018.
[49] P. P.
Provenzano, K. W. Eliceiri, J. M. Campbell, et al. Collagen reorganization at the tumor-stromal interface facilitates local invasion. BMC Med., 4(1):38, 2006.
[50] P. P. Provenzano, K. W. Eliceiri, and P. J. Keely. Shining new light on 3D cell motility and the metastatic process. Trends Cell Biol., 19(11):638–648, 2009.
[51] A. M. Rajnicek, L. E. Foubister, and C. D. McCaig. Prioritising guidance cues: Directional migration induced by substratum contours and electrical gradients is controlled by a rho/cdc42 switch. Dev. Biol., 312(1):448–460, 2007.
[52] S. W. Rhee, A. M. Taylor, C. H. Tu, D. H. Cribbs, C. Cotman, and N. Li Jeon. Patterned cell culture inside microfluidic devices. Lab Chip, 51:102–107, 2005.
[53] D. Schlüter, I. Ramis-Conde, and M. Chaplain. Computational modeling of single-cell migration: The leading role of extracellular matrix fibers. Biophys. J., 103:1141–1151, 2012.
[54] M. Scianna, L. Preziosi, and K. Wolf. A cellular Potts model simulating cell migration on and in matrix environments. Math. Biosci. Eng., 10:235–261, 2013.
[55] P. Steeg. Targeting metastasis. Nat. Rev. Cancer, 16:201–218, 2016.
[56] D. W. Stroock. Some stochastic processes which arise from a model of the motion of a bacterium. Z. Wahrscheinlichkeitstheorie verw. Geb., 28(4):305–315, 1974.
[57] H. Sundararaghavan, R. Saunders, D. Hammer, and J. Burdick. Fiber alignment directs cell motility over chemotactic gradients. Biotechnol. Bioeng., 110(4):1249–1254, 2013.
[58] M. A. Wagle and R. T. Tranquillo. A self-consistent cell flux expression for simultaneous chemotaxis and contact guidance in tissues. J. Math. Biol., 41(4):315–330, 2000.
[59] P. C. Wilkinson and J. M. Lackie. The influence of contact guidance on chemotaxis of human neutrophil leukocytes. Exp. Cell Res., 145(2):255–264, 1983.
[60] K. Wolf, I. Mazo, H. Leung, K. Engelke, U. H. von Andrian, E. I. Deryugina, A. Y. Strongin, E.-B. Bröcker, and P. Friedl.
Compensation mechanism in tumor cell migration: mesenchymal-amoeboid transition after blocking of pericellular proteolysis. J. Cell Biol., 160(2):267–277, 2003.

================================================
FILE: Chapter08/gpt-2-train_files/memory_saving_gradients.py
================================================
from toposort import toposort
import contextlib
import numpy as np
import tensorflow as tf
import tensorflow.contrib.graph_editor as ge
import time
import sys
sys.setrecursionlimit(10000)

# refers back to current module if we decide to split helpers out
util = sys.modules[__name__]

# getting rid of "WARNING:tensorflow:VARIABLES collection name is deprecated"
setattr(tf.GraphKeys, "VARIABLES", "variables")

# save original gradients since tf.gradient could be monkey-patched to point
# to our version
from tensorflow.python.ops import gradients as tf_gradients_lib
tf_gradients = tf_gradients_lib.gradients

MIN_CHECKPOINT_NODE_SIZE = 1024  # use lower value during testing

# specific versions we can use to do process-wide replacement of tf.gradients
def gradients_speed(ys, xs, grad_ys=None, **kwargs):
    return gradients(ys, xs, grad_ys, checkpoints='speed', **kwargs)

def gradients_memory(ys, xs, grad_ys=None, **kwargs):
    return gradients(ys, xs, grad_ys, checkpoints='memory', **kwargs)

def gradients_collection(ys, xs, grad_ys=None, **kwargs):
    return gradients(ys, xs, grad_ys, checkpoints='collection', **kwargs)

def gradients(ys, xs, grad_ys=None, checkpoints='collection', **kwargs):
    '''
    Authors: Tim Salimans & Yaroslav Bulatov

    memory efficient gradient implementation inspired by "Training Deep Nets
    with Sublinear Memory Cost" by Chen et al. 2016
    (https://arxiv.org/abs/1604.06174)

    ys, xs, grad_ys, kwargs are the arguments to standard tensorflow
    tf.gradients
    (https://www.tensorflow.org/versions/r0.12/api_docs/python/train.html#gradients)

    'checkpoints' can either be
        - a list consisting of tensors from the forward pass of the neural net
          that we should re-use when calculating the gradients in the backward
          pass; all other tensors that do not appear in this list will be
          re-computed
        - a string specifying how this list should be determined. currently we
          support
            - 'speed': checkpoint all outputs of convolutions and matmuls.
              these ops are usually the most expensive, so checkpointing them
              maximizes the running speed (this is a good option if
              nonlinearities, concats, batchnorms, etc are taking up a lot of
              memory)
            - 'memory': try to minimize the memory usage (currently using a
              very simple strategy that identifies a number of bottleneck
              tensors in the graph to checkpoint)
            - 'collection': look for a tensorflow collection named
              'checkpoints', which holds the tensors to checkpoint
    '''
    # print("Calling memsaving gradients with", checkpoints)
    if not isinstance(ys, list):
        ys = [ys]
    if not isinstance(xs, list):
        xs = [xs]

    bwd_ops = ge.get_backward_walk_ops([y.op for y in ys], inclusive=True)

    debug_print("bwd_ops: %s", bwd_ops)

    # forward ops are all ops that are candidates for recomputation
    fwd_ops = ge.get_forward_walk_ops([x.op for x in xs],
                                      inclusive=True,
                                      within_ops=bwd_ops)
    debug_print("fwd_ops: %s", fwd_ops)

    # exclude ops with no inputs
    fwd_ops = [op for op in fwd_ops if op.inputs]

    # don't recompute xs, remove variables
    xs_ops = _to_ops(xs)
    fwd_ops = [op for op in fwd_ops if not op in xs_ops]
    fwd_ops = [op for op in fwd_ops if not '/assign' in op.name]
    fwd_ops = [op for op in fwd_ops if not '/Assign' in op.name]
    fwd_ops = [op for op in fwd_ops if not '/read' in op.name]
    ts_all = ge.filter_ts(fwd_ops, True)  # get the tensors
    ts_all = [t for t in ts_all if '/read' not in t.name]
    ts_all = set(ts_all) - set(xs) - set(ys)

    # construct list of tensors to checkpoint during forward pass, if not
    # given as input
    if type(checkpoints) is not list:
        if checkpoints == 'collection':
            checkpoints = tf.get_collection('checkpoints')
        elif checkpoints == 'speed':
            # checkpoint all expensive ops to maximize running speed
            checkpoints = ge.filter_ts_from_regex(fwd_ops, 'conv2d|Conv|MatMul')
        elif checkpoints == 'memory':
            # remove very small tensors and some weird ops
            def fixdims(t):  # tf.Dimension values are not compatible with int, convert manually
                try:
                    return [int(e if e.value is not None else 64) for e in t]
                except:
                    return [0]  # unknown shape
            ts_all = [t for t in ts_all
                      if np.prod(fixdims(t.shape)) > MIN_CHECKPOINT_NODE_SIZE]
            ts_all = [t for t in ts_all if 'L2Loss' not in t.name]
            ts_all = [t for t in ts_all if 'entropy' not in t.name]
            ts_all = [t for t in ts_all if 'FusedBatchNorm' not in t.name]
            ts_all = [t for t in ts_all if 'Switch' not in t.name]
            ts_all = [t for t in ts_all if 'dropout' not in t.name]
            # DV: FP16_FIX - need to add 'Cast' layer here to make it work for FP16
            ts_all = [t for t in ts_all if 'Cast' not in t.name]

            # filter out all tensors that are inputs of the backward graph
            with util.capture_ops() as bwd_ops:
                tf_gradients(ys, xs, grad_ys, **kwargs)

            bwd_inputs = [t for op in bwd_ops for t in op.inputs]
            # list of tensors in forward graph that is in input to bwd graph
            ts_filtered = list(set(bwd_inputs).intersection(ts_all))
            debug_print("Using tensors %s", ts_filtered)

            # try two slightly different ways of getting bottlenecks tensors
            # to checkpoint
            for ts in [ts_filtered, ts_all]:

                # get all bottlenecks in the graph
                bottleneck_ts = []
                for t in ts:
                    b = set(ge.get_backward_walk_ops(t.op, inclusive=True,
                                                     within_ops=fwd_ops))
                    f = set(ge.get_forward_walk_ops(t.op, inclusive=False,
                                                    within_ops=fwd_ops))
                    # check that there are not shortcuts
                    b_inp = set([inp for op in b for inp in op.inputs]).intersection(ts_all)
                    f_inp = set([inp for op in f for inp in op.inputs]).intersection(ts_all)
                    if not set(b_inp).intersection(f_inp) and len(b_inp) + len(f_inp) >= len(ts_all):
                        bottleneck_ts.append(t)  # we have a bottleneck!
                    else:
                        debug_print("Rejected bottleneck candidate and ops %s",
                                    [t] + list(set(ts_all) - set(b_inp) - set(f_inp)))

                # success? or try again without filtering?
                if len(bottleneck_ts) >= np.sqrt(len(ts_filtered)):  # yes, enough bottlenecks found!
                    break

            if not bottleneck_ts:
                raise Exception('unable to find bottleneck tensors! please provide checkpoint nodes manually, or use checkpoints="speed".')

            # sort the bottlenecks
            bottlenecks_sorted_lists = tf_toposort(bottleneck_ts, within_ops=fwd_ops)
            sorted_bottlenecks = [t for ts in bottlenecks_sorted_lists for t in ts]

            # save an approximately optimal number ~ sqrt(N)
            N = len(ts_filtered)
            if len(bottleneck_ts) <= np.ceil(np.sqrt(N)):
                checkpoints = sorted_bottlenecks
            else:
                step = int(np.ceil(len(bottleneck_ts) / np.sqrt(N)))
                checkpoints = sorted_bottlenecks[step::step]
        else:
            raise Exception('%s is unsupported input for "checkpoints"' % (checkpoints,))

    checkpoints = list(set(checkpoints).intersection(ts_all))

    # at this point automatic selection happened and checkpoints is list of nodes
    assert isinstance(checkpoints, list)

    debug_print("Checkpoint nodes used: %s", checkpoints)
    # better error handling of special cases
    # xs are already handled as checkpoint nodes, so no need to include them
    xs_intersect_checkpoints = set(xs).intersection(set(checkpoints))
    if xs_intersect_checkpoints:
        debug_print("Warning, some input nodes are also checkpoint nodes: %s",
                    xs_intersect_checkpoints)
    ys_intersect_checkpoints = set(ys).intersection(set(checkpoints))
    debug_print("ys: %s, checkpoints: %s, intersect: %s", ys, checkpoints,
                ys_intersect_checkpoints)
    # saving an output node (ys) gives no benefit in memory while creating
    # new edge cases, exclude them
    if ys_intersect_checkpoints:
        debug_print("Warning, some output nodes are also checkpoints nodes: %s",
                    format_ops(ys_intersect_checkpoints))

    # remove initial and terminal nodes from checkpoints list if present
    checkpoints = list(set(checkpoints) - set(ys) - set(xs))

    # check that we have some nodes to checkpoint
    # if not checkpoints:
    #     raise Exception('no checkpoints nodes found or given as input! ')

    # disconnect dependencies between checkpointed tensors
    checkpoints_disconnected = {}
    for x in checkpoints:
        if x.op and x.op.name is not None:
            grad_node = tf.stop_gradient(x, name=x.op.name + "_sg")
        else:
            grad_node = tf.stop_gradient(x)
        checkpoints_disconnected[x] = grad_node

    # partial derivatives to the checkpointed tensors and xs
    ops_to_copy = fast_backward_ops(seed_ops=[y.op for y in ys],
                                    stop_at_ts=checkpoints,
                                    within_ops=fwd_ops)
    debug_print("Found %s ops to copy within fwd_ops %s, seed %s, stop_at %s",
                len(ops_to_copy), fwd_ops, [r.op for r in ys], checkpoints)
    debug_print("ops_to_copy = %s", ops_to_copy)
    debug_print("Processing list %s", ys)
    copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {})
    for origin_op, op in info._transformed_ops.items():
        op._set_device(origin_op.node_def.device)
    copied_ops = info._transformed_ops.values()
    debug_print("Copied %s to %s", ops_to_copy, copied_ops)
    ge.reroute_ts(checkpoints_disconnected.values(),
                  checkpoints_disconnected.keys(),
                  can_modify=copied_ops)
    debug_print("Rewired %s in place of %s restricted to %s",
                checkpoints_disconnected.values(),
                checkpoints_disconnected.keys(), copied_ops)

    # get gradients with respect to current boundary + original x's
    copied_ys = [info._transformed_ops[y.op]._outputs[0] for y in ys]
    boundary = list(checkpoints_disconnected.values())
    dv = tf_gradients(ys=copied_ys, xs=boundary + xs, grad_ys=grad_ys, **kwargs)
    debug_print("Got gradients %s", dv)
    debug_print("for %s", copied_ys)
    debug_print("with respect to %s", boundary + xs)

    inputs_to_do_before = [y.op for y in ys]
    if grad_ys is not None:
        inputs_to_do_before += grad_ys
    wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None]
    my_add_control_inputs(wait_to_do_ops, inputs_to_do_before)

    # partial derivatives to the checkpointed nodes
    # dictionary of "node: backprop" for nodes in the boundary
    d_checkpoints = {r: dr for r, dr in zip(checkpoints_disconnected.keys(),
                                            dv[:len(checkpoints_disconnected)])}
    # partial derivatives to xs (usually the params of the neural net)
    d_xs = dv[len(checkpoints_disconnected):]

    # incorporate derivatives flowing through the checkpointed nodes
    checkpoints_sorted_lists = tf_toposort(checkpoints, within_ops=fwd_ops)
    for ts in checkpoints_sorted_lists[::-1]:
        debug_print("Processing list %s", ts)
        checkpoints_other = [r for r in checkpoints if r not in ts]
        checkpoints_disconnected_other = [checkpoints_disconnected[r] for r in checkpoints_other]

        # copy part of the graph below current checkpoint node, stopping at
        # other checkpoints nodes
        ops_to_copy = fast_backward_ops(within_ops=fwd_ops,
                                        seed_ops=[r.op for r in ts],
                                        stop_at_ts=checkpoints_other)
        debug_print("Found %s ops to copy within %s, seed %s, stop_at %s",
                    len(ops_to_copy), fwd_ops, [r.op for r in ts],
                    checkpoints_other)
        debug_print("ops_to_copy = %s", ops_to_copy)
        if not ops_to_copy:  # we're done!
            break
        copied_sgv, info = ge.copy_with_input_replacements(ge.sgv(ops_to_copy), {})
        for origin_op, op in info._transformed_ops.items():
            op._set_device(origin_op.node_def.device)
        copied_ops = info._transformed_ops.values()
        debug_print("Copied %s to %s", ops_to_copy, copied_ops)
        ge.reroute_ts(checkpoints_disconnected_other, checkpoints_other,
                      can_modify=copied_ops)
        debug_print("Rewired %s in place of %s restricted to %s",
                    checkpoints_disconnected_other, checkpoints_other, copied_ops)

        # gradient flowing through the checkpointed node
        boundary = [info._transformed_ops[r.op]._outputs[0] for r in ts]
        substitute_backprops = [d_checkpoints[r] for r in ts]
        dv = tf_gradients(boundary,
                          checkpoints_disconnected_other + xs,
                          grad_ys=substitute_backprops, **kwargs)
        debug_print("Got gradients %s", dv)
        debug_print("for %s", boundary)
        debug_print("with respect to %s", checkpoints_disconnected_other + xs)
        debug_print("with boundary backprop substitutions %s", substitute_backprops)

        inputs_to_do_before = [d_checkpoints[r].op for r in ts]
        wait_to_do_ops = list(copied_ops) + [g.op for g in dv if g is not None]
        my_add_control_inputs(wait_to_do_ops, inputs_to_do_before)

        # partial derivatives to the checkpointed nodes
        for r, dr in zip(checkpoints_other, dv[:len(checkpoints_other)]):
            if dr is not None:
                if d_checkpoints[r] is None:
                    d_checkpoints[r] = dr
                else:
                    d_checkpoints[r] += dr

        def _unsparsify(x):
            if not isinstance(x, tf.IndexedSlices):
                return x
            assert x.dense_shape is not None, "memory_saving_gradients encountered sparse gradients of unknown shape"
            indices = x.indices
            while indices.shape.ndims < x.values.shape.ndims:
                indices = tf.expand_dims(indices, -1)
            return tf.scatter_nd(indices, x.values, x.dense_shape)

        # partial derivatives to xs (usually the params of the neural net)
        d_xs_new = dv[len(checkpoints_other):]
        for j in range(len(xs)):
            if d_xs_new[j] is not None:
                if d_xs[j] is None:
                    d_xs[j] = _unsparsify(d_xs_new[j])
                else:
                    d_xs[j] += _unsparsify(d_xs_new[j])

    return d_xs

def tf_toposort(ts, within_ops=None):
    all_ops = ge.get_forward_walk_ops([x.op for x in ts], within_ops=within_ops)

    deps = {}
    for op in all_ops:
        for o in op.outputs:
            deps[o] = set(op.inputs)
    sorted_ts = toposort(deps)

    # only keep the tensors from our original list
    ts_sorted_lists = []
    for l in sorted_ts:
        keep = list(set(l).intersection(ts))
        if keep:
            ts_sorted_lists.append(keep)

    return ts_sorted_lists

def fast_backward_ops(within_ops, seed_ops, stop_at_ts):
    bwd_ops = set(ge.get_backward_walk_ops(seed_ops, stop_at_ts=stop_at_ts))
    ops = bwd_ops.intersection(within_ops).difference([t.op for t in stop_at_ts])
    return list(ops)

@contextlib.contextmanager
def capture_ops():
    """Decorator to capture ops created in the block.
    with capture_ops() as ops:
        # create some ops
    print(ops)  # => prints ops created.
    """

    micros = int(time.time() * 10**6)
    scope_name = str(micros)
    op_list = []
    with tf.name_scope(scope_name):
        yield op_list

    g = tf.get_default_graph()
    op_list.extend(ge.select_ops(scope_name + "/.*", graph=g))

def _to_op(tensor_or_op):
    if hasattr(tensor_or_op, "op"):
        return tensor_or_op.op
    return tensor_or_op

def _to_ops(iterable):
    if not _is_iterable(iterable):
        return iterable
    return [_to_op(i) for i in iterable]

def _is_iterable(o):
    try:
        _ = iter(o)
    except Exception:
        return False
    return True

DEBUG_LOGGING = False

def debug_print(s, *args):
    """Like logger.log, but also replaces all TensorFlow ops/tensors with their
    names. Sensitive to value of DEBUG_LOGGING, see enable_debug/disable_debug

    Usage:
        debug_print("see tensors %s for %s", tensorlist, [1,2,3])
    """

    if DEBUG_LOGGING:
        formatted_args = [format_ops(arg) for arg in args]
        print("DEBUG " + s % tuple(formatted_args))

def format_ops(ops, sort_outputs=True):
    """Helper method for printing ops. Converts Tensor/Operation op to op.name,
    rest to str(op)."""

    if hasattr(ops, '__iter__') and not isinstance(ops, str):
        l = [(op.name if hasattr(op, "name") else str(op)) for op in ops]
        if sort_outputs:
            return sorted(l)
        return l
    else:
        return ops.name if hasattr(ops, "name") else str(ops)

def my_add_control_inputs(wait_to_do_ops, inputs_to_do_before):
    for op in wait_to_do_ops:
        ci = [i for i in inputs_to_do_before
              if op.control_inputs is None or i not in op.control_inputs]
        ge.add_control_inputs(op, ci)

================================================
FILE: Chapter08/gpt-2-train_files/train.py
================================================
#!/usr/bin/env python3
# Usage:
#  PYTHONPATH=src ./train --dataset

import argparse
import json
import os
import numpy as np
import tensorflow as tf
import time
import tqdm
from tensorflow.core.protobuf import rewriter_config_pb2

import model, sample, encoder
from load_dataset import load_dataset, Sampler
from accumulate import AccumulatingOptimizer
import memory_saving_gradients

CHECKPOINT_DIR = 'checkpoint'
SAMPLE_DIR = 'samples'

parser = argparse.ArgumentParser(
    description='Fine-tune GPT-2 on your custom dataset.',
    formatter_class=argparse.ArgumentDefaultsHelpFormatter)

parser.add_argument('--dataset', metavar='PATH', type=str, required=True, help='Input file, directory, or glob pattern (utf-8 text, or preencoded .npz files).')
parser.add_argument('--model_name', metavar='MODEL', type=str, default='117M', help='Pretrained model name')
parser.add_argument('--combine', metavar='CHARS', type=int, default=50000, help='Concatenate input files with <|endoftext|> separator into chunks of this minimum size')
parser.add_argument('--encoding', type=str, default='utf-8', help='Set the encoding for reading and writing files.')

parser.add_argument('--batch_size', metavar='SIZE', type=int, default=1, help='Batch size')
parser.add_argument('--learning_rate', metavar='LR', type=float, default=0.00002, help='Learning rate for Adam')
parser.add_argument('--accumulate_gradients', metavar='N', type=int, default=1, help='Accumulate gradients across N minibatches.')
parser.add_argument('--memory_saving_gradients', default=False, action='store_true', help='Use gradient checkpointing to reduce vram usage.')
parser.add_argument('--only_train_transformer_layers', default=False, action='store_true', help='Restrict training to the transformer blocks.')
parser.add_argument('--optimizer', type=str, default='adam', help='Optimizer. .')
parser.add_argument('--noise', type=float, default=0.0, help='Add noise to input training data to regularize against typos.')

parser.add_argument('--top_k', type=int, default=40, help='K for top-k sampling.')
parser.add_argument('--top_p', type=float, default=0.0, help='P for top-p sampling. Overrides top_k if set > 0.')

parser.add_argument('--restore_from', type=str, default='latest', help='Either "latest", "fresh", or a path to a checkpoint file')
parser.add_argument('--run_name', type=str, default='run1', help='Run id. Name of subdirectory in checkpoint/ and samples/')
parser.add_argument('--sample_every', metavar='N', type=int, default=100, help='Generate samples every N steps')
parser.add_argument('--sample_length', metavar='TOKENS', type=int, default=1023, help='Sample this many tokens')
parser.add_argument('--sample_num', metavar='N', type=int, default=1, help='Generate this many samples')
parser.add_argument('--save_every', metavar='N', type=int, default=1000, help='Write a checkpoint every N steps')

parser.add_argument('--val_dataset', metavar='PATH', type=str, default=None, help='Dataset for validation loss, defaults to --dataset.')
parser.add_argument('--val_batch_size', metavar='SIZE', type=int, default=2, help='Batch size for validation.')
parser.add_argument('--val_batch_count', metavar='N', type=int, default=40, help='Number of batches for validation.')
parser.add_argument('--val_every', metavar='STEPS', type=int, default=0, help='Calculate validation loss every STEPS steps.')

def maketree(path):
    try:
        os.makedirs(path)
    except:
        pass

def randomize(context, hparams, p):
    if p > 0:
        mask = tf.random.uniform(shape=tf.shape(context)) < p
        noise = tf.random.uniform(shape=tf.shape(context), minval=0,
                                  maxval=hparams.n_vocab, dtype=tf.int32)
        return tf.where(mask, noise, context)
    else:
        return context

def main():
    args = parser.parse_args()
    models_dir = '/content/gpt-2/src/models'
    enc = encoder.get_encoder(args.model_name, models_dir)
    hparams = model.default_hparams()
    with open(os.path.join('models', args.model_name, 'hparams.json')) as f:
        hparams.override_from_dict(json.load(f))

    if args.sample_length > hparams.n_ctx:
        raise ValueError(
            "Can't get samples longer than window size: %s" % hparams.n_ctx)

    if args.model_name == '345M':
        args.memory_saving_gradients = True
        if args.optimizer == 'adam':
            args.only_train_transformer_layers = True

    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True
    config.graph_options.rewrite_options.layout_optimizer = rewriter_config_pb2.RewriterConfig.OFF
    with tf.Session(config=config) as sess:
        context = tf.placeholder(tf.int32, [args.batch_size, None])
        context_in = randomize(context, hparams, args.noise)
        output = model.model(hparams=hparams, X=context_in)
        loss = tf.reduce_mean(
            tf.nn.sparse_softmax_cross_entropy_with_logits(
                labels=context[:, 1:], logits=output['logits'][:, :-1]))

        if args.val_every > 0:
            val_context = tf.placeholder(tf.int32, [args.val_batch_size, None])
            val_output = model.model(hparams=hparams, X=val_context)
            val_loss = tf.reduce_mean(
                tf.nn.sparse_softmax_cross_entropy_with_logits(
                    labels=val_context[:, 1:], logits=val_output['logits'][:, :-1]))
            val_loss_summary = tf.summary.scalar('val_loss', val_loss)

        tf_sample = sample.sample_sequence(
            hparams=hparams,
            length=args.sample_length,
            context=context,
            batch_size=args.batch_size,
            temperature=1.0,
            top_k=args.top_k,
            top_p=args.top_p)

        all_vars = [v for v in tf.trainable_variables() if 'model' in v.name]
        train_vars = [v for v in all_vars if '/h' in v.name] if args.only_train_transformer_layers else all_vars

        if args.optimizer == 'adam':
            opt = tf.train.AdamOptimizer(learning_rate=args.learning_rate)
        elif args.optimizer == 'sgd':
            opt = tf.train.GradientDescentOptimizer(learning_rate=args.learning_rate)
        else:
            exit('Bad optimizer:', args.optimizer)

        if args.accumulate_gradients > 1:
            if args.memory_saving_gradients:
                exit("Memory saving gradients are not implemented for gradient accumulation yet.")
            opt = AccumulatingOptimizer(
                opt=opt,
                var_list=train_vars)
            opt_reset = opt.reset()
            opt_compute = opt.compute_gradients(loss)
            opt_apply = opt.apply_gradients()
            summary_loss = tf.summary.scalar('loss', opt_apply)
        else:
            if args.memory_saving_gradients:
                opt_grads = memory_saving_gradients.gradients(loss, train_vars)
            else:
                opt_grads = tf.gradients(loss, train_vars)
            opt_grads = list(zip(opt_grads, train_vars))
            opt_apply = opt.apply_gradients(opt_grads)
            summary_loss = tf.summary.scalar('loss', loss)

        summary_lr = tf.summary.scalar('learning_rate', args.learning_rate)
        summaries = tf.summary.merge([summary_lr, summary_loss])

        summary_log = tf.summary.FileWriter(
            os.path.join(CHECKPOINT_DIR, args.run_name))

        saver = tf.train.Saver(
            var_list=all_vars,
            max_to_keep=5,
            keep_checkpoint_every_n_hours=2)
        sess.run(tf.global_variables_initializer())

        if args.restore_from == 'latest':
            ckpt = tf.train.latest_checkpoint(
                os.path.join(CHECKPOINT_DIR, args.run_name))
            if ckpt is None:
                # Get fresh GPT weights if new run.
                ckpt = tf.train.latest_checkpoint(
                    os.path.join('models', args.model_name))
        elif args.restore_from == 'fresh':
            ckpt = tf.train.latest_checkpoint(
                os.path.join('models', args.model_name))
        else:
            ckpt = tf.train.latest_checkpoint(args.restore_from)
        print('Loading checkpoint', ckpt)
        saver.restore(sess, ckpt)

        print('Loading dataset...')
        chunks = load_dataset(enc, args.dataset, args.combine, encoding=args.encoding)
        data_sampler = Sampler(chunks)
        if args.val_every > 0:
            if args.val_dataset:
                val_chunks = load_dataset(enc, args.val_dataset, args.combine, encoding=args.encoding)
            else:
                val_chunks = chunks
        print('dataset has', data_sampler.total_size, 'tokens')
        print('Training...')

        if args.val_every > 0:
            # Sample from validation set once with fixed seed to make
            # it deterministic during training as well as across runs.
val_data_sampler = Sampler(val_chunks, seed=1) val_batches = [[val_data_sampler.sample(1024) for _ in range(args.val_batch_size)] for _ in range(args.val_batch_count)] counter = 1 counter_path = os.path.join(CHECKPOINT_DIR, args.run_name, 'counter') if os.path.exists(counter_path): # Load the step number if we're resuming a run # Add 1 so we don't immediately try to save again with open(counter_path, 'r') as fp: counter = int(fp.read()) + 1 def save(): maketree(os.path.join(CHECKPOINT_DIR, args.run_name)) print( 'Saving', os.path.join(CHECKPOINT_DIR, args.run_name, 'model-{}').format(counter)) saver.save( sess, os.path.join(CHECKPOINT_DIR, args.run_name, 'model'), global_step=counter) with open(counter_path, 'w') as fp: fp.write(str(counter) + '\n') def generate_samples(): print('Generating samples...') context_tokens = data_sampler.sample(1) all_text = [] index = 0 while index < args.sample_num: out = sess.run( tf_sample, feed_dict={context: args.batch_size * [context_tokens]}) for i in range(min(args.sample_num - index, args.batch_size)): text = enc.decode(out[i]) text = '======== SAMPLE {} ========\n{}\n'.format( index + 1, text) all_text.append(text) index += 1 print(text) maketree(os.path.join(SAMPLE_DIR, args.run_name)) with open( os.path.join(SAMPLE_DIR, args.run_name, 'samples-{}').format(counter), 'w', encoding=args.encoding) as fp: fp.write('\n'.join(all_text)) def validation(): print('Calculating validation loss...') losses = [] for batch in tqdm.tqdm(val_batches): losses.append(sess.run(val_loss, feed_dict={val_context: batch})) v_val_loss = np.mean(losses) v_summary = sess.run(val_loss_summary, feed_dict={val_loss: v_val_loss}) summary_log.add_summary(v_summary, counter) summary_log.flush() print( '[{counter} | {time:2.2f}] validation loss = {loss:2.2f}' .format( counter=counter, time=time.time() - start_time, loss=v_val_loss)) def sample_batch(): return [data_sampler.sample(1024) for _ in range(args.batch_size)] avg_loss = (0.0, 0.0) start_time = 
time.time() try: while True: if counter % args.save_every == 0: save() if counter % args.sample_every == 0: generate_samples() if args.val_every > 0 and (counter % args.val_every == 0 or counter == 1): validation() if args.accumulate_gradients > 1: sess.run(opt_reset) for _ in range(args.accumulate_gradients): sess.run( opt_compute, feed_dict={context: sample_batch()}) (v_loss, v_summary) = sess.run((opt_apply, summaries)) else: (_, v_loss, v_summary) = sess.run( (opt_apply, loss, summaries), feed_dict={context: sample_batch()}) summary_log.add_summary(v_summary, counter) avg_loss = (avg_loss[0] * 0.99 + v_loss, avg_loss[1] * 0.99 + 1.0) print( '[{counter} | {time:2.2f}] loss={loss:2.2f} avg={avg:2.2f}' .format( counter=counter, time=time.time() - start_time, loss=v_loss, avg=avg_loss[0] / avg_loss[1])) counter += 1 except KeyboardInterrupt: print('interrupted') save() if __name__ == '__main__': main() ================================================ FILE: Chapter08/text.txt ================================================ [File too large to display: 10.9 MB] ================================================ FILE: Chapter09/SRL.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "colab": { "name": "SRL.ipynb", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "name": "python3", "display_name": "Python 3" } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "bzRpKWxSgZmb" }, "source": [ "#Semantic Role Labeling(SRL)\n", "\n", "The notebook is an implementation of the Allen Institute for AI BERT-based model. 
[Reference usage of the Notebook](https://demo.allennlp.org/semantic-role-labeling/MjE4NjI1Ng==)\n", "\n", "The BERT-based model is an implementation of [Peng Shi and Jimmy Lin, (2019), ‘Simple BERT Models for Relation Extraction and Semantic Role Labeling’](https://arxiv.org/abs/1904.05255)\n" ] }, { "cell_type": "markdown", "metadata": { "id": "9aeqrxgQhKmE" }, "source": [ "Installing AllenNLP" ] }, { "cell_type": "code", "metadata": { "id": "XAIkwYFaeBBD", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "d75870e6-02e6-4d6f-f165-c89bf32e17bb" }, "source": [ "!pip install allennlp==1.0.0 allennlp-models==1.0.0" ], "execution_count": 1, "outputs": [ { "output_type": "stream", "text": [ "Collecting allennlp==1.0.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/2c/49/bf0ec241496a82c9dd2f0b6ff6f8156b6b2b72b849df8c00a4f2bcf61485/allennlp-1.0.0-py3-none-any.whl (473kB)\n", "\u001b[K |████████████████████████████████| 481kB 9.4MB/s \n", "\u001b[?25hCollecting allennlp-models==1.0.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/3d/d5/9ee1d0b8c217b6978e42e54fbab8bafe9e792f0f8262f381dde44cee44ae/allennlp_models-1.0.0-py3-none-any.whl (282kB)\n", "\u001b[K |████████████████████████████████| 286kB 28.6MB/s \n", "\u001b[?25hRequirement already satisfied: tqdm>=4.19 in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (4.41.1)\n", "Requirement already satisfied: spacy<2.3,>=2.1.0 in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (2.2.4)\n", "Requirement already satisfied: scikit-learn in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (0.22.2.post1)\n", "Requirement already satisfied: scipy in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (1.4.1)\n", "Requirement already satisfied: nltk in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (3.2.5)\n", "Collecting torch<1.6.0,>=1.5.0\n", "\u001b[?25l Downloading
https://files.pythonhosted.org/packages/62/01/457b49d790b6c4b9720e6f9dbbb617692f6ce8afdaadf425c055c41a7416/torch-1.5.1-cp36-cp36m-manylinux1_x86_64.whl (753.2MB)\n", "\u001b[K |████████████████████████████████| 753.2MB 23kB/s \n", "\u001b[?25hRequirement already satisfied: pytest in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (3.6.4)\n", "Requirement already satisfied: dataclasses; python_version < \"3.7\" in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (0.8)\n", "Collecting overrides==3.0.0\n", " Downloading https://files.pythonhosted.org/packages/42/8d/caa729f809ecdf8e76fac3c1ff7d3f0b72c398c9dd8a6919927a30a873b3/overrides-3.0.0.tar.gz\n", "Requirement already satisfied: requests>=2.18 in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (2.23.0)\n", "Collecting jsonpickle\n", " Downloading https://files.pythonhosted.org/packages/ee/d5/1cc282dc23346a43aab461bf2e8c36593aacd34242bee1a13fa750db0cfe/jsonpickle-1.4.2-py2.py3-none-any.whl\n", "Collecting transformers<2.12,>=2.9\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/48/35/ad2c5b1b8f99feaaf9d7cdadaeef261f098c6e1a6a2935d4d07662a6b780/transformers-2.11.0-py3-none-any.whl (674kB)\n", "\u001b[K |████████████████████████████████| 675kB 42.1MB/s \n", "\u001b[?25hRequirement already satisfied: filelock<3.1,>=3.0 in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (3.0.12)\n", "Collecting tensorboardX>=1.2\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/af/0c/4f41bcd45db376e6fe5c619c01100e9b7531c55791b7244815bac6eac32c/tensorboardX-2.1-py2.py3-none-any.whl (308kB)\n", "\u001b[K |████████████████████████████████| 317kB 30.5MB/s \n", "\u001b[?25hCollecting boto3\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/f6/bf/6fd00f2e8b2f9e8688b10b616c66be985a0053729cb1e92ac2f6e8ec1cd6/boto3-1.16.40-py2.py3-none-any.whl (130kB)\n", "\u001b[K |████████████████████████████████| 133kB 47.7MB/s \n", 
"\u001b[?25hRequirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (1.19.4)\n", "Requirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (from allennlp==1.0.0) (2.10.0)\n", "Collecting jsonnet>=0.10.0; sys_platform != \"win32\"\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/42/40/6f16e5ac994b16fa71c24310f97174ce07d3a97b433275589265c6b94d2b/jsonnet-0.17.0.tar.gz (259kB)\n", "\u001b[K |████████████████████████████████| 266kB 59.0MB/s \n", "\u001b[?25hCollecting word2number>=1.1\n", " Downloading https://files.pythonhosted.org/packages/4a/29/a31940c848521f0725f0df6b25dca8917f13a2025b0e8fcbe5d0457e45e6/word2number-1.1.zip\n", "Collecting conllu==3.0\n", " Downloading https://files.pythonhosted.org/packages/66/0b/a8863b5c14aee200a13a0f8c28550fd0132e947ae88441c9f517eb84613b/conllu-3.0-py2.py3-none-any.whl\n", "Collecting py-rouge==1.1\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/9c/1d/0bdbaf559fb7afe32308ebc84a2028600988212d7eb7fb9f69c4e829e4a0/py_rouge-1.1-py3-none-any.whl (56kB)\n", "\u001b[K |████████████████████████████████| 61kB 5.9MB/s \n", "\u001b[?25hRequirement already satisfied: thinc==7.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (7.4.0)\n", "Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (3.0.5)\n", "Requirement already satisfied: srsly<1.1.0,>=1.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.0.5)\n", "Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (2.0.5)\n", "Requirement already satisfied: plac<1.2.0,>=0.9.6 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.1.3)\n", "Requirement already satisfied: blis<0.5.0,>=0.4.0 in 
/usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (0.4.1)\n", "Requirement already satisfied: catalogue<1.1.0,>=0.0.7 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.0.0)\n", "Requirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (50.3.2)\n", "Requirement already satisfied: wasabi<1.1.0,>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (0.8.0)\n", "Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.6/dist-packages (from spacy<2.3,>=2.1.0->allennlp==1.0.0) (1.0.5)\n", "Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.6/dist-packages (from scikit-learn->allennlp==1.0.0) (1.0.0)\n", "Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from nltk->allennlp==1.0.0) (1.15.0)\n", "Requirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch<1.6.0,>=1.5.0->allennlp==1.0.0) (0.16.0)\n", "Requirement already satisfied: attrs>=17.4.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (20.3.0)\n", "Requirement already satisfied: more-itertools>=4.0.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (8.6.0)\n", "Requirement already satisfied: pluggy<0.8,>=0.5 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (0.7.1)\n", "Requirement already satisfied: py>=1.5.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (1.10.0)\n", "Requirement already satisfied: atomicwrites>=1.0 in /usr/local/lib/python3.6/dist-packages (from pytest->allennlp==1.0.0) (1.4.0)\n", "Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests>=2.18->allennlp==1.0.0) (2.10)\n", "Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from 
requests>=2.18->allennlp==1.0.0) (3.0.4)\n", "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests>=2.18->allennlp==1.0.0) (2020.12.5)\n", "Requirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests>=2.18->allennlp==1.0.0) (1.24.3)\n", "Requirement already satisfied: importlib-metadata; python_version < \"3.8\" in /usr/local/lib/python3.6/dist-packages (from jsonpickle->allennlp==1.0.0) (3.3.0)\n", "Collecting tokenizers==0.7.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/14/e5/a26eb4716523808bb0a799fcfdceb6ebf77a18169d9591b2f46a9adb87d9/tokenizers-0.7.0-cp36-cp36m-manylinux1_x86_64.whl (3.8MB)\n", "\u001b[K |████████████████████████████████| 3.8MB 48.9MB/s \n", "\u001b[?25hCollecting sentencepiece\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/e5/2d/6d4ca4bef9a67070fa1cac508606328329152b1df10bdf31fb6e4e727894/sentencepiece-0.1.94-cp36-cp36m-manylinux2014_x86_64.whl (1.1MB)\n", "\u001b[K |████████████████████████████████| 1.1MB 47.5MB/s \n", "\u001b[?25hCollecting sacremoses\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\n", "\u001b[K |████████████████████████████████| 890kB 50.1MB/s \n", "\u001b[?25hRequirement already satisfied: regex!=2019.12.17 in /usr/local/lib/python3.6/dist-packages (from transformers<2.12,>=2.9->allennlp==1.0.0) (2019.12.20)\n", "Requirement already satisfied: packaging in /usr/local/lib/python3.6/dist-packages (from transformers<2.12,>=2.9->allennlp==1.0.0) (20.8)\n", "Requirement already satisfied: protobuf>=3.8.0 in /usr/local/lib/python3.6/dist-packages (from tensorboardX>=1.2->allennlp==1.0.0) (3.12.4)\n", "Collecting jmespath<1.0.0,>=0.7.1\n", " Downloading 
https://files.pythonhosted.org/packages/07/cb/5f001272b6faeb23c1c9e0acc04d48eaaf5c862c17709d20e3469c6e0139/jmespath-0.10.0-py2.py3-none-any.whl\n", "Collecting botocore<1.20.0,>=1.19.40\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/ea/77/13fc099a10c22d08d766e244412c114694e21982c04cafc1ade33d8a430c/botocore-1.19.40-py2.py3-none-any.whl (7.1MB)\n", "\u001b[K |████████████████████████████████| 7.1MB 34.3MB/s \n", "\u001b[?25hCollecting s3transfer<0.4.0,>=0.3.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/69/79/e6afb3d8b0b4e96cefbdc690f741d7dd24547ff1f94240c997a26fa908d3/s3transfer-0.3.3-py2.py3-none-any.whl (69kB)\n", "\u001b[K |████████████████████████████████| 71kB 7.6MB/s \n", "\u001b[?25hRequirement already satisfied: typing-extensions>=3.6.4; python_version < \"3.8\" in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \"3.8\"->jsonpickle->allennlp==1.0.0) (3.7.4.3)\n", "Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.6/dist-packages (from importlib-metadata; python_version < \"3.8\"->jsonpickle->allennlp==1.0.0) (3.4.0)\n", "Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->transformers<2.12,>=2.9->allennlp==1.0.0) (7.1.2)\n", "Requirement already satisfied: pyparsing>=2.0.2 in /usr/local/lib/python3.6/dist-packages (from packaging->transformers<2.12,>=2.9->allennlp==1.0.0) (2.4.7)\n", "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /usr/local/lib/python3.6/dist-packages (from botocore<1.20.0,>=1.19.40->boto3->allennlp==1.0.0) (2.8.1)\n", "Building wheels for collected packages: overrides, jsonnet, word2number, sacremoses\n", " Building wheel for overrides (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n", " Created wheel for overrides: filename=overrides-3.0.0-cp36-none-any.whl size=5669 sha256=6f80143088a78455dd287d78fdced3dd986f9bb4a23edb94ec3376db1b81df6f\n", " Stored in directory: /root/.cache/pip/wheels/6f/1b/ec/6c71a1eb823df7f850d956b2d8c50a6d49c191e1063d73b9be\n", " Building wheel for jsonnet (setup.py) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for jsonnet: filename=jsonnet-0.17.0-cp36-cp36m-linux_x86_64.whl size=3387942 sha256=1418eb22c110c8535ec13a7e223704f7d00de2f9d27b395203bcc0a6044e03aa\n", " Stored in directory: /root/.cache/pip/wheels/26/7a/37/7dbcc30a6b4efd17b91ad1f0128b7bbf84813bd4e1cfb8c1e3\n", " Building wheel for word2number (setup.py) ... \u001b[?25l\u001b[?25hdone\n", " Created wheel for word2number: filename=word2number-1.1-cp36-none-any.whl size=5588 sha256=6c711b492080c629e45bd6f594bbd082f84b77fb2859e691b07f8cc43e891868\n", " Stored in directory: /root/.cache/pip/wheels/46/2f/53/5f5c1d275492f2fce1cdab9a9bb12d49286dead829a4078e0e\n", " Building wheel for sacremoses (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n", " Created wheel for sacremoses: filename=sacremoses-0.0.43-cp36-none-any.whl size=893261 sha256=ca22850c9a27802373c980ccdce43020db3a1c30576474ebc27c849dd7a8374e\n", " Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\n", "Successfully built overrides jsonnet word2number sacremoses\n", "\u001b[31mERROR: torchvision 0.8.1+cu101 has requirement torch==1.7.0, but you'll have torch 1.5.1 which is incompatible.\u001b[0m\n", "\u001b[31mERROR: botocore 1.19.40 has requirement urllib3<1.27,>=1.25.4; python_version != \"3.4\", but you'll have urllib3 1.24.3 which is incompatible.\u001b[0m\n", "Installing collected packages: torch, overrides, jsonpickle, tokenizers, sentencepiece, sacremoses, transformers, tensorboardX, jmespath, botocore, s3transfer, boto3, jsonnet, allennlp, word2number, conllu, py-rouge, allennlp-models\n", " Found existing installation: torch 1.7.0+cu101\n", " Uninstalling torch-1.7.0+cu101:\n", " Successfully uninstalled torch-1.7.0+cu101\n", "Successfully installed allennlp-1.0.0 allennlp-models-1.0.0 boto3-1.16.40 botocore-1.19.40 conllu-3.0 jmespath-0.10.0 jsonnet-0.17.0 jsonpickle-1.4.2 overrides-3.0.0 py-rouge-1.1 s3transfer-0.3.3 sacremoses-0.0.43 sentencepiece-0.1.94 tensorboardX-2.1 tokenizers-0.7.0 torch-1.5.1 transformers-2.11.0 word2number-1.1\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "EcSZJu8ohUv5" }, "source": [ "Sample 1: Did Bob really think he could prepare a meal for 50 people in only a few hours?" 
] }, { "cell_type": "code", "metadata": { "id": "1pziWuZKeMti", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "cef0cbfb-9256-45d6-9419-3cba6bd616c3" }, "source": [ "!echo '{\"sentence\": \"Did Bob really think he could prepare a meal for 50 people in only a few hours?\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": 2, "outputs": [ { "output_type": "stream", "text": [ "2020-12-20 09:07:37,371 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\n", "2020-12-20 09:07:37.529750: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n", "2020-12-20 09:07:39,325 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\n", "[nltk_data] Downloading package punkt to /root/nltk_data...\n", "[nltk_data] Unzipping tokenizers/punkt.zip.\n", "[nltk_data] Downloading package wordnet to /root/nltk_data...\n", "[nltk_data] Unzipping corpora/wordnet.zip.\n", "2020-12-20 09:07:43,159 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:07:43,159 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:07:43,160 - INFO - filelock - Lock 140370276310488 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:07:43,161 - INFO - allennlp.common.file_utils - 
https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz not found in cache, downloading to /root/.allennlp/cache/tmpqkld8r7a.tmp\n", "100% 406056588/406056588 [00:06<00:00, 64834243.25B/s]\n", "2020-12-20 09:07:49,714 - INFO - allennlp.common.file_utils - Renaming temp file /root/.allennlp/cache/tmpqkld8r7a.tmp to cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:07:49,714 - INFO - allennlp.common.file_utils - creating metadata file for /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:07:49,715 - INFO - filelock - Lock 140370276310488 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:07:49,715 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:07:49,715 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpaqcbgixa\n", "2020-12-20 09:07:54,008 - INFO - allennlp.common.params - type = from_instances\n", "2020-12-20 09:07:54,009 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpaqcbgixa/vocabulary.\n", "2020-12-20 09:07:54,009 - INFO - filelock - Lock 140370276158376 acquired on /tmp/tmpaqcbgixa/vocabulary/.lock\n", "2020-12-20 09:07:54,036 - INFO - filelock - Lock 140370276158376 
released on /tmp/tmpaqcbgixa/vocabulary/.lock\n", "2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.type = srl_bert\n", "2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.regularizer = None\n", "2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\n", "2020-12-20 09:07:54,037 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\n", "2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.initializer = \n", "2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.label_smoothing = None\n", "2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.ignore_span_metric = False\n", "2020-12-20 09:07:54,038 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\n", "2020-12-20 09:07:54,338 - INFO - filelock - Lock 140370267846920 acquired on /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517.lock\n", "2020-12-20 09:07:54,339 - INFO - transformers.file_utils - https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json not found in cache or force_download set to True, downloading to /root/.cache/torch/transformers/tmp9z3yyeli\n", "Downloading: 100% 433/433 [00:00<00:00, 297kB/s]\n", "2020-12-20 09:07:54,651 - INFO - transformers.file_utils - storing https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json in cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:07:54,651 - INFO - transformers.file_utils - creating metadata file for 
/root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:07:54,651 - INFO - filelock - Lock 140370267846920 released on /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517.lock\n", "2020-12-20 09:07:54,652 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:07:54,652 - INFO - transformers.configuration_utils - Model config BertConfig {\n", " \"architectures\": [\n", " \"BertForMaskedLM\"\n", " ],\n", " \"attention_probs_dropout_prob\": 0.1,\n", " \"hidden_act\": \"gelu\",\n", " \"hidden_dropout_prob\": 0.1,\n", " \"hidden_size\": 768,\n", " \"initializer_range\": 0.02,\n", " \"intermediate_size\": 3072,\n", " \"layer_norm_eps\": 1e-12,\n", " \"max_position_embeddings\": 512,\n", " \"model_type\": \"bert\",\n", " \"num_attention_heads\": 12,\n", " \"num_hidden_layers\": 12,\n", " \"pad_token_id\": 0,\n", " \"type_vocab_size\": 2,\n", " \"vocab_size\": 30522\n", "}\n", "\n", "2020-12-20 09:07:54,878 - INFO - filelock - Lock 140370267847312 acquired on /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157.lock\n", "2020-12-20 09:07:54,879 - INFO - transformers.file_utils - https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin not found in cache or force_download set to True, downloading to /root/.cache/torch/transformers/tmp2oj6r8r1\n", "Downloading: 100% 440M/440M [00:10<00:00, 43.8MB/s]\n", "2020-12-20 09:08:04,981 - INFO - 
transformers.file_utils - storing https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin in cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:08:04,981 - INFO - transformers.file_utils - creating metadata file for /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:08:04,982 - INFO - filelock - Lock 140370267847312 released on /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157.lock\n", "2020-12-20 09:08:04,982 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:08:07,795 - INFO - allennlp.nn.initializers - Initializing parameters\n", "2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\n", "2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias\n", "2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight\n", "2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - bert_model.embeddings.position_embeddings.weight\n", "2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight\n", "2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight\n", "2020-12-20 09:08:07,796 - INFO - allennlp.nn.initializers - 
bert_model.encoder.layer.0.attention.output.LayerNorm.bias\n", "[... identical INFO lines listing the remaining default-initialized parameters of encoder layers 0-11 (attention self query/key/value, attention output dense and LayerNorm, intermediate dense, output dense and LayerNorm, weights and biases) omitted for brevity ...]\n", 
"2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.weight\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.bias\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.weight\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.bias\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.weight\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.bias\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.weight\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.bias\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - tag_projection_layer.bias\n", "2020-12-20 09:08:07,807 - INFO - allennlp.nn.initializers - tag_projection_layer.weight\n", "2020-12-20 09:08:08,259 - INFO - allennlp.common.params - dataset_reader.type = srl\n", "2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.lazy = False\n", "2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\n", "2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.max_instances = None\n", "2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\n", "2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\n", "2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\n", "2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\n", 
"2020-12-20 09:08:08,260 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\n", "2020-12-20 09:08:08,568 - INFO - filelock - Lock 140370276307520 acquired on /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084.lock\n", "2020-12-20 09:08:08,568 - INFO - transformers.file_utils - https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt not found in cache or force_download set to True, downloading to /root/.cache/torch/transformers/tmp733mj1ki\n", "Downloading: 100% 232k/232k [00:00<00:00, 880kB/s]\n", "2020-12-20 09:08:09,117 - INFO - transformers.file_utils - storing https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt in cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "2020-12-20 09:08:09,117 - INFO - transformers.file_utils - creating metadata file for /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "2020-12-20 09:08:09,117 - INFO - filelock - Lock 140370276307520 released on /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084.lock\n", "2020-12-20 09:08:09,118 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "input 0: {\"sentence\": \"Did Bob really think he could prepare a meal for 50 people in only a few hours?\"}\n", "prediction: {\"verbs\": [{\"verb\": 
\"think\", \"description\": \"Did [ARG0: Bob] [ARGM-ADV: really] [V: think] [ARG1: he could prepare a meal for 50 people in only a few hours] ?\", \"tags\": [\"O\", \"B-ARG0\", \"B-ARGM-ADV\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\"]}, {\"verb\": \"could\", \"description\": \"Did Bob really think he [V: could] prepare a meal for 50 people in only a few hours ?\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"B-V\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"prepare\", \"description\": \"Did Bob really think [ARG0: he] [ARGM-MOD: could] [V: prepare] [ARG1: a meal for 50 people] [ARGM-TMP: in only a few hours] ?\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"B-ARG0\", \"B-ARGM-MOD\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"B-ARGM-TMP\", \"I-ARGM-TMP\", \"I-ARGM-TMP\", \"I-ARGM-TMP\", \"I-ARGM-TMP\", \"O\"]}], \"words\": [\"Did\", \"Bob\", \"really\", \"think\", \"he\", \"could\", \"prepare\", \"a\", \"meal\", \"for\", \"50\", \"people\", \"in\", \"only\", \"a\", \"few\", \"hours\", \"?\"]}\n", "\n", "2020-12-20 09:08:10,277 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpaqcbgixa\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "hWHpOrNvZQ3m" }, "source": [ "Sample 2: Mrs. and Mr. Tomaso went to Europe for vacation and visited Paris and first went to visit the Eiffel Tower." ] }, { "cell_type": "code", "metadata": { "id": "yFKPLyqihrB_", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "e0f52798-903a-4d04-f388-d26a700b8cd7" }, "source": [ "!echo '{\"sentence\": \"Mrs. and Mr. 
Tomaso went to Europe for vacation and visited Paris and first went to visit the Eiffel Tower.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": 3, "outputs": [ { "output_type": "stream", "text": [ "2020-12-20 09:08:12,622 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\n", "2020-12-20 09:08:12.774532: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n", "2020-12-20 09:08:14,547 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\n", "2020-12-20 09:08:15,761 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:15,761 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:15,762 - INFO - filelock - Lock 139888470380440 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:08:15,763 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\n", "2020-12-20 09:08:15,763 - INFO - filelock - Lock 139888470380440 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:08:15,763 - INFO - allennlp.models.archival - loading archive file 
https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:15,763 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmp7jeuj77a\n", "2020-12-20 09:08:19,975 - INFO - allennlp.common.params - type = from_instances\n", "2020-12-20 09:08:19,976 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmp7jeuj77a/vocabulary.\n", "2020-12-20 09:08:19,976 - INFO - filelock - Lock 139888468747992 acquired on /tmp/tmp7jeuj77a/vocabulary/.lock\n", "2020-12-20 09:08:20,002 - INFO - filelock - Lock 139888468747992 released on /tmp/tmp7jeuj77a/vocabulary/.lock\n", "2020-12-20 09:08:20,003 - INFO - allennlp.common.params - model.type = srl_bert\n", "2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.regularizer = None\n", "2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\n", "2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\n", "2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.initializer = \n", "2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.label_smoothing = None\n", "2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.ignore_span_metric = False\n", "2020-12-20 09:08:20,004 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\n", "2020-12-20 09:08:20,298 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at 
/root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:08:20,298 - INFO - transformers.configuration_utils - Model config BertConfig {\n", " \"architectures\": [\n", " \"BertForMaskedLM\"\n", " ],\n", " \"attention_probs_dropout_prob\": 0.1,\n", " \"hidden_act\": \"gelu\",\n", " \"hidden_dropout_prob\": 0.1,\n", " \"hidden_size\": 768,\n", " \"initializer_range\": 0.02,\n", " \"intermediate_size\": 3072,\n", " \"layer_norm_eps\": 1e-12,\n", " \"max_position_embeddings\": 512,\n", " \"model_type\": \"bert\",\n", " \"num_attention_heads\": 12,\n", " \"num_hidden_layers\": 12,\n", " \"pad_token_id\": 0,\n", " \"type_vocab_size\": 2,\n", " \"vocab_size\": 30522\n", "}\n", "\n", "2020-12-20 09:08:20,492 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:08:23,170 - INFO - allennlp.nn.initializers - Initializing parameters\n", "2020-12-20 09:08:23,171 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\n", "2020-12-20 09:08:23,171 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias\n", "2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight\n", "2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers - bert_model.embeddings.position_embeddings.weight\n", "2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight\n", "2020-12-20 09:08:23,172 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight\n", "2020-12-20 09:08:23,172 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias\n", " [... identical INFO lines repeat for every weight and bias of encoder layers 0 through 11; output truncated for brevity ...]\n", "2020-12-20 09:08:23,211 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.bias\n", "2020-12-20 09:08:23,211 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight\n", "2020-12-20 09:08:23,211 - INFO - allennlp.nn.initializers - tag_projection_layer.bias\n", "2020-12-20 09:08:23,212 - INFO - allennlp.nn.initializers - tag_projection_layer.weight\n", "2020-12-20 09:08:23,664 - INFO - allennlp.common.params - dataset_reader.type = srl\n", "2020-12-20 09:08:23,664 - INFO - allennlp.common.params - dataset_reader.lazy = False\n", "2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\n", "2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.max_instances = None\n", "2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\n", "2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\n", "2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\n", "2020-12-20 09:08:23,665 
- INFO - allennlp.common.params - dataset_reader.domain_identifier = None\n", "2020-12-20 09:08:23,665 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\n", "2020-12-20 09:08:23,987 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "input 0: {\"sentence\": \"Mrs. and Mr. Tomaso went to Europe for vacation and visited Paris and first went to visit the Eiffel Tower.\"}\n", "prediction: {\"verbs\": [{\"verb\": \"went\", \"description\": \"[ARG0: Mrs. and Mr. Tomaso] [V: went] [ARG4: to Europe] [ARGM-PRP: for vacation] and visited Paris and first went to visit the Eiffel Tower .\", \"tags\": [\"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"B-V\", \"B-ARG4\", \"I-ARG4\", \"B-ARGM-PRP\", \"I-ARGM-PRP\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"visited\", \"description\": \"[ARG0: Mrs. and Mr. Tomaso] went to Europe for vacation and [V: visited] [ARG1: Paris] and first went to visit the Eiffel Tower .\", \"tags\": [\"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-V\", \"B-ARG1\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"went\", \"description\": \"[ARG0: Mrs. and Mr. Tomaso] went to Europe for vacation and visited Paris and [ARGM-TMP: first] [V: went] [ARGM-PRP: to visit the Eiffel Tower] .\", \"tags\": [\"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-ARGM-TMP\", \"B-V\", \"B-ARGM-PRP\", \"I-ARGM-PRP\", \"I-ARGM-PRP\", \"I-ARGM-PRP\", \"I-ARGM-PRP\", \"O\"]}, {\"verb\": \"visit\", \"description\": \"[ARG0: Mrs. and Mr. 
Tomaso] went to Europe for vacation and visited Paris and first went to [V: visit] [ARG1: the Eiffel Tower] .\", \"tags\": [\"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\"]}], \"words\": [\"Mrs.\", \"and\", \"Mr.\", \"Tomaso\", \"went\", \"to\", \"Europe\", \"for\", \"vacation\", \"and\", \"visited\", \"Paris\", \"and\", \"first\", \"went\", \"to\", \"visit\", \"the\", \"Eiffel\", \"Tower\", \".\"]}\n", "\n", "2020-12-20 09:08:25,342 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmp7jeuj77a\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "v45ooI5ReoXk" }, "source": [ "Sample 3: John wanted to drink tea, Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice." ] }, { "cell_type": "code", "metadata": { "id": "Pz-jLVeAersa", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "13b50208-4caf-476d-abf8-627b21c65863" }, "source": [ "!echo '{\"sentence\": \"John wanted to drink tea, Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": 4, "outputs": [ { "output_type": "stream", "text": [ "2020-12-20 09:08:27,582 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\n", "2020-12-20 09:08:27.767124: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n", "2020-12-20 09:08:29,592 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\n", "2020-12-20 09:08:30,797 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:30,797 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:30,799 - INFO - filelock - Lock 140584440236464 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:08:30,799 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\n", "2020-12-20 09:08:30,799 - INFO - filelock - Lock 140584440236464 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:08:30,799 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:30,799 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpse7z902p\n", "2020-12-20 09:08:35,061 - INFO - allennlp.common.params - type = from_instances\n", "2020-12-20 09:08:35,061 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpse7z902p/vocabulary.\n", "2020-12-20 09:08:35,062 - INFO - filelock - Lock 140584442130328 acquired on /tmp/tmpse7z902p/vocabulary/.lock\n", 
"2020-12-20 09:08:35,089 - INFO - filelock - Lock 140584442130328 released on /tmp/tmpse7z902p/vocabulary/.lock\n", "2020-12-20 09:08:35,089 - INFO - allennlp.common.params - model.type = srl_bert\n", "2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.regularizer = None\n", "2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\n", "2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\n", "2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.initializer = \n", "2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.label_smoothing = None\n", "2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.ignore_span_metric = False\n", "2020-12-20 09:08:35,090 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\n", "2020-12-20 09:08:35,400 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:08:35,400 - INFO - transformers.configuration_utils - Model config BertConfig {\n", " [... identical to the BertConfig shown in the previous cell output ...]\n", "}\n", "\n", "2020-12-20 09:08:35,598 - INFO - transformers.modeling_utils - 
loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:08:38,288 - INFO - allennlp.nn.initializers - Initializing parameters\n", "2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\n", "2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias\n", "2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight\n", "2020-12-20 09:08:38,289 - INFO - allennlp.nn.initializers - bert_model.embeddings.position_embeddings.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - 
bert_model.encoder.layer.0.attention.self.query.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.weight\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.bias\n", "2020-12-20 09:08:38,290 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.weight\n", "2020-12-20 09:08:38,376 - INFO - 
allennlp.nn.initializers - bert_model.pooler.dense.bias\n", "2020-12-20 09:08:38,376 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight\n", "2020-12-20 09:08:38,376 - INFO - allennlp.nn.initializers - tag_projection_layer.bias\n", "2020-12-20 09:08:38,376 - INFO - allennlp.nn.initializers - tag_projection_layer.weight\n", "2020-12-20 09:08:38,830 - INFO - allennlp.common.params - dataset_reader.type = srl\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.lazy = False\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.max_instances = None\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\n", "2020-12-20 09:08:38,831 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\n", "2020-12-20 09:08:39,125 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "input 0: {\"sentence\": \"John wanted to drink tea, Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice.\"}\n", "prediction: {\"verbs\": [{\"verb\": \"wanted\", \"description\": \"[ARG0: John] [V: wanted] [ARG1: to drink tea] , Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice .\", 
\"tags\": [\"B-ARG0\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"drink\", \"description\": \"[ARG0: John] wanted to [V: drink] [ARG1: tea] , Mary likes to drink coffee but Karim drank some cool water and Faiza would like to drink tomato juice .\", \"tags\": [\"B-ARG0\", \"O\", \"O\", \"B-V\", \"B-ARG1\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"likes\", \"description\": \"John wanted to drink tea , [ARG0: Mary] [V: likes] [ARG1: to drink coffee] but Karim drank some cool water and Faiza would like to drink tomato juice .\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-ARG0\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"drink\", \"description\": \"John wanted to drink tea , [ARG0: Mary] likes to [V: drink] [ARG1: coffee] but Karim drank some cool water and Faiza would like to drink tomato juice .\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-ARG0\", \"O\", \"O\", \"B-V\", \"B-ARG1\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"drank\", \"description\": \"John wanted to drink tea , Mary likes to drink coffee but [ARG0: Karim] [V: drank] [ARG1: some cool water and Faiza] would like to drink tomato juice .\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-ARG0\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"would\", \"description\": \"John wanted to drink tea , Mary likes to drink coffee but Karim drank some cool water and Faiza [V: 
would] [ARGM-DIS: like] to drink tomato juice .\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-V\", \"B-ARGM-DIS\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"like\", \"description\": \"John wanted to drink tea , Mary likes to drink coffee but Karim drank [ARG0: some cool water and Faiza] [ARGM-MOD: would] [V: like] [ARG1: to drink tomato juice] .\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"B-ARGM-MOD\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\"]}, {\"verb\": \"drink\", \"description\": \"John wanted to drink tea , Mary likes to drink coffee but Karim drank [ARG0: some cool water and Faiza] would like to [V: drink] [ARG1: tomato juice] .\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"O\", \"O\", \"O\", \"B-V\", \"B-ARG1\", \"I-ARG1\", \"O\"]}], \"words\": [\"John\", \"wanted\", \"to\", \"drink\", \"tea\", \",\", \"Mary\", \"likes\", \"to\", \"drink\", \"coffee\", \"but\", \"Karim\", \"drank\", \"some\", \"cool\", \"water\", \"and\", \"Faiza\", \"would\", \"like\", \"to\", \"drink\", \"tomato\", \"juice\", \".\"]}\n", "\n", "2020-12-20 09:08:40,852 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpse7z902p\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "k7QVm45YmxTt" }, "source": [ "Sample 4: Alice, whose husband went jogging every Sunday, liked to go to a dancing class in the meantime." 
] }, { "cell_type": "code", "metadata": { "id": "mvm6zN7_m0GI", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "736f6e5d-0f14-4692-962d-696f3e884c5e" }, "source": [ "!echo '{\"sentence\": \"Alice, whose husband went jogging every Sunday, liked to go to a dancing class in the meantime.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": 5, "outputs": [ { "output_type": "stream", "text": [ "2020-12-20 09:08:43,153 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\n", "2020-12-20 09:08:43.324358: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n", "2020-12-20 09:08:45,080 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\n", "2020-12-20 09:08:46,294 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:46,294 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:46,295 - INFO - filelock - Lock 139693451663232 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:08:46,295 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\n", "2020-12-20 09:08:46,295 - INFO - filelock - Lock 139693451663232 released on 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:08:46,295 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:08:46,295 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpmnefy_l0\n", "2020-12-20 09:08:50,552 - INFO - allennlp.common.params - type = from_instances\n", "2020-12-20 09:08:50,552 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpmnefy_l0/vocabulary.\n", "2020-12-20 09:08:50,552 - INFO - filelock - Lock 139692801159064 acquired on /tmp/tmpmnefy_l0/vocabulary/.lock\n", "2020-12-20 09:08:50,580 - INFO - filelock - Lock 139692801159064 released on /tmp/tmpmnefy_l0/vocabulary/.lock\n", "2020-12-20 09:08:50,580 - INFO - allennlp.common.params - model.type = srl_bert\n", "2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.regularizer = None\n", "2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\n", "2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\n", "2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.initializer = \n", "2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.label_smoothing = None\n", "2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.ignore_span_metric = False\n", "2020-12-20 09:08:50,581 - INFO - allennlp.common.params - model.srl_eval_path = 
/usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\n", "2020-12-20 09:08:50,888 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:08:50,889 - INFO - transformers.configuration_utils - Model config BertConfig {\n", " \"architectures\": [\n", " \"BertForMaskedLM\"\n", " ],\n", " \"attention_probs_dropout_prob\": 0.1,\n", " \"hidden_act\": \"gelu\",\n", " \"hidden_dropout_prob\": 0.1,\n", " \"hidden_size\": 768,\n", " \"initializer_range\": 0.02,\n", " \"intermediate_size\": 3072,\n", " \"layer_norm_eps\": 1e-12,\n", " \"max_position_embeddings\": 512,\n", " \"model_type\": \"bert\",\n", " \"num_attention_heads\": 12,\n", " \"num_hidden_layers\": 12,\n", " \"pad_token_id\": 0,\n", " \"type_vocab_size\": 2,\n", " \"vocab_size\": 30522\n", "}\n", "\n", "2020-12-20 09:08:50,928 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:08:53,601 - INFO - allennlp.nn.initializers - Initializing parameters\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - 
bert_model.embeddings.position_embeddings.weight\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.weight\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.bias\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.weight\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.bias\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.weight\n", "2020-12-20 09:08:53,602 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.bias\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.weight\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.bias\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.weight\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.bias\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.weight\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.bias\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.weight\n", "2020-12-20 09:08:53,603 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.bias\n", "2020-12-20 09:08:53,603 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.weight\n", " [... identical default-initialization INFO lines for encoder layers 1-11, the pooler, and the tag projection layer truncated ...]\n", "2020-12-20 09:08:54,186 - INFO - allennlp.common.params - dataset_reader.type = srl\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.lazy = False\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.max_instances = None\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - 
dataset_reader.manual_distributed_sharding = False\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\n", "2020-12-20 09:08:54,187 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\n", "2020-12-20 09:08:54,497 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "input 0: {\"sentence\": \"Alice, whose husband went jogging every Sunday, liked to go to a dancing class in the meantime.\"}\n", "prediction: {\"verbs\": [{\"verb\": \"went\", \"description\": \"Alice , [ARG1: whose husband] [V: went] [ARG2: jogging] [ARGM-TMP: every Sunday] , liked to go to a dancing class in the meantime .\", \"tags\": [\"O\", \"O\", \"B-ARG1\", \"I-ARG1\", \"B-V\", \"B-ARG2\", \"B-ARGM-TMP\", \"I-ARGM-TMP\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"jogging\", \"description\": \"Alice , [ARG0: whose husband] went [V: jogging] [ARGM-TMP: every Sunday] , liked to go to a dancing class in the meantime .\", \"tags\": [\"O\", \"O\", \"B-ARG0\", \"I-ARG0\", \"O\", \"B-V\", \"B-ARGM-TMP\", \"I-ARGM-TMP\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\"]}, {\"verb\": \"liked\", \"description\": \"[ARG0: Alice , whose husband went jogging every Sunday] , [V: liked] [ARG1: to go to a dancing class in the meantime] .\", \"tags\": [\"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"O\", \"B-V\", \"B-ARG1\", 
\"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\"]}, {\"verb\": \"go\", \"description\": \"[ARG0: Alice , whose husband went jogging every Sunday] , liked to [V: go] [ARG4: to a dancing class] [ARGM-TMP: in the meantime] .\", \"tags\": [\"B-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"I-ARG0\", \"O\", \"O\", \"O\", \"B-V\", \"B-ARG4\", \"I-ARG4\", \"I-ARG4\", \"I-ARG4\", \"B-ARGM-TMP\", \"I-ARGM-TMP\", \"I-ARGM-TMP\", \"O\"]}, {\"verb\": \"dancing\", \"description\": \"Alice , whose husband went jogging every Sunday , liked to go to a [V: dancing] [ARG0: class] in the meantime .\", \"tags\": [\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"B-V\", \"B-ARG0\", \"O\", \"O\", \"O\", \"O\"]}], \"words\": [\"Alice\", \",\", \"whose\", \"husband\", \"went\", \"jogging\", \"every\", \"Sunday\", \",\", \"liked\", \"to\", \"go\", \"to\", \"a\", \"dancing\", \"class\", \"in\", \"the\", \"meantime\", \".\"]}\n", "\n", "2020-12-20 09:08:55,842 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpmnefy_l0\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "Hog7HwIHzdm8" }, "source": [ "Sample 5: The bright sun, the blue sky, the warm sand, the palm trees, everything round off." 
] }, { "cell_type": "code", "metadata": { "id": "6NFVmvYtzguX", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "48e86088-f681-4721-9ea4-42fd7476e6a8" }, "source": [ "!echo '{\"sentence\": \"The bright sun, the blue sky, the warm sand, the palm trees, everything round off.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": 6, "outputs": [ { "output_type": "stream", "text": [ "2020-12-20 09:08:58,132 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\n", "2020-12-20 09:08:58.293529: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n", "2020-12-20 09:09:00,044 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\n", "2020-12-20 09:09:01,253 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:01,253 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:01,255 - INFO - filelock - Lock 140218919610240 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:09:01,255 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\n", "2020-12-20 09:09:01,255 - INFO - filelock - Lock 140218919610240 released on 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:09:01,255 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:01,256 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpn3jl1yco\n", "2020-12-20 09:09:05,507 - INFO - allennlp.common.params - type = from_instances\n", "2020-12-20 09:09:05,507 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpn3jl1yco/vocabulary.\n", "2020-12-20 09:09:05,507 - INFO - filelock - Lock 140218269110168 acquired on /tmp/tmpn3jl1yco/vocabulary/.lock\n", "2020-12-20 09:09:05,535 - INFO - filelock - Lock 140218269110168 released on /tmp/tmpn3jl1yco/vocabulary/.lock\n", "2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.type = srl_bert\n", "2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.regularizer = None\n", "2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\n", "2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\n", "2020-12-20 09:09:05,536 - INFO - allennlp.common.params - model.initializer = \n", "2020-12-20 09:09:05,537 - INFO - allennlp.common.params - model.label_smoothing = None\n", "2020-12-20 09:09:05,537 - INFO - allennlp.common.params - model.ignore_span_metric = False\n", "2020-12-20 09:09:05,537 - INFO - allennlp.common.params - model.srl_eval_path = 
/usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\n", "2020-12-20 09:09:05,837 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:09:05,837 - INFO - transformers.configuration_utils - Model config BertConfig {\n", " \"architectures\": [\n", " \"BertForMaskedLM\"\n", " ],\n", " \"attention_probs_dropout_prob\": 0.1,\n", " \"hidden_act\": \"gelu\",\n", " \"hidden_dropout_prob\": 0.1,\n", " \"hidden_size\": 768,\n", " \"initializer_range\": 0.02,\n", " \"intermediate_size\": 3072,\n", " \"layer_norm_eps\": 1e-12,\n", " \"max_position_embeddings\": 512,\n", " \"model_type\": \"bert\",\n", " \"num_attention_heads\": 12,\n", " \"num_hidden_layers\": 12,\n", " \"pad_token_id\": 0,\n", " \"type_vocab_size\": 2,\n", " \"vocab_size\": 30522\n", "}\n", "\n", "2020-12-20 09:09:06,048 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:09:08,747 - INFO - allennlp.nn.initializers - Initializing parameters\n", "2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\n", "2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias\n", "2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight\n", "2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - 
bert_model.embeddings.position_embeddings.weight\n", "2020-12-20 09:09:08,748 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.weight\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.weight\n", "2020-12-20 09:09:08,749 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.bias\n", "2020-12-20 09:09:08,749 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.weight\n", " [... identical default-initialization entries for bert_model.encoder.layer.1 through layer.11 omitted ...]\n", "2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.bias\n", "2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight\n", "2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - tag_projection_layer.bias\n", "2020-12-20 09:09:08,790 - INFO - allennlp.nn.initializers - tag_projection_layer.weight\n", "2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.type = srl\n", "2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.lazy = False\n", "2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\n", "2020-12-20 09:09:09,268 - INFO - allennlp.common.params - dataset_reader.max_instances = None\n", "2020-12-20 09:09:09,268 - INFO - allennlp.common.params - 
dataset_reader.manual_distributed_sharding = False\n", "2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\n", "2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\n", "2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\n", "2020-12-20 09:09:09,269 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\n", "2020-12-20 09:09:09,561 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "input 0: {\"sentence\": \"The bright sun, the blue sky, the warm sand, the palm trees, everything round off.\"}\n", "prediction: {\"verbs\": [], \"words\": [\"The\", \"bright\", \"sun\", \",\", \"the\", \"blue\", \"sky\", \",\", \"the\", \"warm\", \"sand\", \",\", \"the\", \"palm\", \"trees\", \",\", \"everything\", \"round\", \"off\", \".\"]}\n", "\n", "2020-12-20 09:09:10,283 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpn3jl1yco\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "T9UCG-qN018X", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "af4f9689-442f-4726-a908-7dba1a824036" }, "source": [ "!echo '{\"sentence\": \"The bright sun, the blue sky, the warm sand, the palm trees, everything rounds off.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": 7, "outputs": [ { "output_type": "stream", "text": [ "2020-12-20 09:09:12,636 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\n", "2020-12-20 09:09:12.789933: I 
tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n", "2020-12-20 09:09:14,547 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\n", "2020-12-20 09:09:15,750 - INFO - allennlp.common.file_utils - checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:15,750 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:15,751 - INFO - filelock - Lock 139884787906432 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:09:15,751 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\n", "2020-12-20 09:09:15,751 - INFO - filelock - Lock 139884787906432 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:09:15,751 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:15,751 - INFO - allennlp.models.archival - extracting archive file 
/root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpuj2lb1i1\n", "2020-12-20 09:09:19,983 - INFO - allennlp.common.params - type = from_instances\n", "2020-12-20 09:09:19,983 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpuj2lb1i1/vocabulary.\n", "2020-12-20 09:09:19,983 - INFO - filelock - Lock 139884137381784 acquired on /tmp/tmpuj2lb1i1/vocabulary/.lock\n", "2020-12-20 09:09:20,009 - INFO - filelock - Lock 139884137381784 released on /tmp/tmpuj2lb1i1/vocabulary/.lock\n", "2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.type = srl_bert\n", "2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.regularizer = None\n", "2020-12-20 09:09:20,010 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\n", "2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\n", "2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.initializer = \n", "2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.label_smoothing = None\n", "2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.ignore_span_metric = False\n", "2020-12-20 09:09:20,011 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\n", "2020-12-20 09:09:20,306 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:09:20,307 - INFO - transformers.configuration_utils - Model config BertConfig {\n", " \"architectures\": [\n", " \"BertForMaskedLM\"\n", " ],\n", " \"attention_probs_dropout_prob\": 
0.1,\n", " \"hidden_act\": \"gelu\",\n", " \"hidden_dropout_prob\": 0.1,\n", " \"hidden_size\": 768,\n", " \"initializer_range\": 0.02,\n", " \"intermediate_size\": 3072,\n", " \"layer_norm_eps\": 1e-12,\n", " \"max_position_embeddings\": 512,\n", " \"model_type\": \"bert\",\n", " \"num_attention_heads\": 12,\n", " \"num_hidden_layers\": 12,\n", " \"pad_token_id\": 0,\n", " \"type_vocab_size\": 2,\n", " \"vocab_size\": 30522\n", "}\n", "\n", "2020-12-20 09:09:20,499 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:09:23,169 - INFO - allennlp.nn.initializers - Initializing parameters\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.position_embeddings.weight\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,170 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.bias\n", "2020-12-20 09:09:23,170 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.weight\n", " [... repeated default-initialization entries omitted; identical to the previous cell's output ...]\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - 
bert_model.encoder.layer.11.attention.self.key.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.query.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.query.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.value.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.attention.self.value.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.intermediate.dense.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.intermediate.dense.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.LayerNorm.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.LayerNorm.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.dense.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.11.output.dense.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.dense.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.output.dense.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.key.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.key.weight\n", "2020-12-20 09:09:23,173 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.query.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.query.weight\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.value.bias\n", "2020-12-20 09:09:23,173 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.value.weight\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.bias\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.weight\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.bias\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.weight\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.bias\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.weight\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,174 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.bias\n", "2020-12-20 09:09:23,175 - 
INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.weight\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.bias\n", "2020-12-20 09:09:23,175 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.weight\n", "2020-12-20 
09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.bias\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.weight\n", "2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.bias\n", 
"2020-12-20 09:09:23,176 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.value.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - 
bert_model.encoder.layer.6.attention.self.value.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.weight\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.LayerNorm.bias\n", "2020-12-20 09:09:23,177 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.LayerNorm.weight\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.bias\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.weight\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.bias\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.weight\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.bias\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.weight\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.bias\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.weight\n", "2020-12-20 09:09:23,178 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.bias\n", "2020-12-20 09:09:23,236 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.weight\n", "2020-12-20 09:09:23,236 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.bias\n", "2020-12-20 09:09:23,236 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.weight\n", "2020-12-20 09:09:23,236 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.LayerNorm.bias\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.LayerNorm.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.bias\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.bias\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.bias\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.bias\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.bias\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.bias\n", "2020-12-20 09:09:23,237 - 
INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.weight\n", "2020-12-20 09:09:23,237 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.weight\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.weight\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.weight\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.weight\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.weight\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.weight\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.bias\n", "2020-12-20 09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.weight\n", "2020-12-20 
09:09:23,238 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.bias\n", "2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.weight\n", "2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.bias\n", "2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.weight\n", "2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.bias\n", "2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight\n", "2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - tag_projection_layer.bias\n", "2020-12-20 09:09:23,239 - INFO - allennlp.nn.initializers - tag_projection_layer.weight\n", "2020-12-20 09:09:23,707 - INFO - allennlp.common.params - dataset_reader.type = srl\n", "2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.lazy = False\n", "2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\n", "2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.max_instances = None\n", "2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\n", "2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\n", "2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\n", "2020-12-20 09:09:23,708 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\n", "2020-12-20 09:09:23,709 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\n", "2020-12-20 09:09:23,994 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at 
/root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "input 0: {\"sentence\": \"The bright sun, the blue sky, the warm sand, the palm trees, everything rounds off.\"}\n", "prediction: {\"verbs\": [{\"verb\": \"rounds\", \"description\": \"[ARG1: The bright sun , the blue sky , the warm sand , the palm trees] , [R-ARG1: everything] [V: rounds] off .\", \"tags\": [\"B-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"I-ARG1\", \"O\", \"B-R-ARG1\", \"B-V\", \"O\", \"O\"]}], \"words\": [\"The\", \"bright\", \"sun\", \",\", \"the\", \"blue\", \"sky\", \",\", \"the\", \"warm\", \"sand\", \",\", \"the\", \"palm\", \"trees\", \",\", \"everything\", \"rounds\", \"off\", \".\"]}\n", "\n", "2020-12-20 09:09:24,932 - INFO - allennlp.models.archival - removing temporary unarchived model dir at /tmp/tmpuj2lb1i1\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "cBrxUvrL3Sp4" }, "source": [ "Sample 6 Ice pucks" ] }, { "cell_type": "code", "metadata": { "id": "rp77Vazw3QY8", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "c8e32eb0-1162-4acc-976e-a25ee428dbda" }, "source": [ "!echo '{\"sentence\": \"Now, ice pucks guys!\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": 8, "outputs": [ { "output_type": "stream", "text": [ "2020-12-20 09:09:27,286 - INFO - transformers.file_utils - PyTorch version 1.5.1 available.\n", "2020-12-20 09:09:27.438284: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library libcudart.so.10.1\n", "2020-12-20 09:09:29,226 - INFO - transformers.file_utils - TensorFlow version 2.4.0 available.\n", "2020-12-20 09:09:30,428 - INFO - allennlp.common.file_utils - 
checking cache for https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:30,428 - INFO - allennlp.common.file_utils - waiting to acquire lock on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:30,429 - INFO - filelock - Lock 139618002246904 acquired on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:09:30,429 - INFO - allennlp.common.file_utils - cache of https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz is up-to-date\n", "2020-12-20 09:09:30,429 - INFO - filelock - Lock 139618002246904 released on /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724.lock\n", "2020-12-20 09:09:30,429 - INFO - allennlp.models.archival - loading archive file https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz from cache at /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724\n", "2020-12-20 09:09:30,430 - INFO - allennlp.models.archival - extracting archive file /root/.allennlp/cache/e20d5b792a8d456a1a61da245d1856d4b7778efe69ac3c30759af61940aa0f42.f72523a9682cb1f5ad3ecf834075fe53a1c25a6bcbf4b40c11e13b7f426a4724 to temp dir /tmp/tmpewur_o27\n", "2020-12-20 09:09:34,712 - INFO - allennlp.common.params - type = from_instances\n", "2020-12-20 09:09:34,712 - INFO - allennlp.data.vocabulary - Loading token dictionary from /tmp/tmpewur_o27/vocabulary.\n", "2020-12-20 
09:09:34,713 - INFO - filelock - Lock 139618002367880 acquired on /tmp/tmpewur_o27/vocabulary/.lock\n", "2020-12-20 09:09:34,741 - INFO - filelock - Lock 139618002367880 released on /tmp/tmpewur_o27/vocabulary/.lock\n", "2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.type = srl_bert\n", "2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.regularizer = None\n", "2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.bert_model = bert-base-uncased\n", "2020-12-20 09:09:34,742 - INFO - allennlp.common.params - model.embedding_dropout = 0.1\n", "2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.initializer = \n", "2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.label_smoothing = None\n", "2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.ignore_span_metric = False\n", "2020-12-20 09:09:34,743 - INFO - allennlp.common.params - model.srl_eval_path = /usr/local/lib/python3.6/dist-packages/allennlp_models/structured_prediction/tools/srl-eval.pl\n", "2020-12-20 09:09:35,046 - INFO - transformers.configuration_utils - loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json from cache at /root/.cache/torch/transformers/4dad0251492946e18ac39290fcfe91b89d370fee250efe9521476438fe8ca185.7156163d5fdc189c3016baca0775ffce230789d7fa2a42ef516483e4ca884517\n", "2020-12-20 09:09:35,047 - INFO - transformers.configuration_utils - Model config BertConfig {\n", " \"architectures\": [\n", " \"BertForMaskedLM\"\n", " ],\n", " \"attention_probs_dropout_prob\": 0.1,\n", " \"hidden_act\": \"gelu\",\n", " \"hidden_dropout_prob\": 0.1,\n", " \"hidden_size\": 768,\n", " \"initializer_range\": 0.02,\n", " \"intermediate_size\": 3072,\n", " \"layer_norm_eps\": 1e-12,\n", " \"max_position_embeddings\": 512,\n", " \"model_type\": \"bert\",\n", " \"num_attention_heads\": 12,\n", " \"num_hidden_layers\": 12,\n", " \"pad_token_id\": 0,\n", " \"type_vocab_size\": 2,\n", " 
\"vocab_size\": 30522\n", "}\n", "\n", "2020-12-20 09:09:35,254 - INFO - transformers.modeling_utils - loading weights file https://cdn.huggingface.co/bert-base-uncased-pytorch_model.bin from cache at /root/.cache/torch/transformers/f2ee78bdd635b758cc0a12352586868bef80e47401abe4c4fcc3832421e7338b.36ca03ab34a1a5d5fa7bc3d03d55c4fa650fed07220e2eeebc06ce58d0e9a157\n", "2020-12-20 09:09:37,949 - INFO - allennlp.nn.initializers - Initializing parameters\n", "2020-12-20 09:09:37,949 - INFO - allennlp.nn.initializers - Done initializing parameters; the following parameters are using their default initialization from their code\n", "2020-12-20 09:09:37,949 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.embeddings.LayerNorm.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.embeddings.position_embeddings.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.embeddings.token_type_embeddings.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.embeddings.word_embeddings.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.output.dense.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.key.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - 
bert_model.encoder.layer.0.attention.self.query.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.query.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.attention.self.value.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.intermediate.dense.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.LayerNorm.weight\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.bias\n", "2020-12-20 09:09:37,950 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.0.output.dense.weight\n", "2020-12-20 09:09:37,953 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.2.attention.self.value.weight\n", "2020-12-20 09:09:37,953 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.bias\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.intermediate.dense.weight\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.bias\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.LayerNorm.weight\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.bias\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.2.output.dense.weight\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.bias\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.output.dense.weight\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.bias\n", "2020-12-20 09:09:37,954 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.key.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.query.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.attention.self.value.weight\n", "2020-12-20 09:09:37,955 
- INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.intermediate.dense.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.LayerNorm.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.3.output.dense.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.output.dense.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.key.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.query.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.bias\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.attention.self.value.weight\n", "2020-12-20 09:09:37,955 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.bias\n", "2020-12-20 
09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.intermediate.dense.weight\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.bias\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.LayerNorm.weight\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.bias\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.4.output.dense.weight\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.bias\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.output.dense.weight\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.bias\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.key.weight\n", "2020-12-20 09:09:37,956 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.query.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.attention.self.value.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.intermediate.dense.weight\n", 
"2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.LayerNorm.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.5.output.dense.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.output.dense.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.key.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.query.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.value.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.attention.self.value.weight\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.bias\n", "2020-12-20 09:09:37,957 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.intermediate.dense.weight\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - 
bert_model.encoder.layer.6.output.LayerNorm.bias\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.LayerNorm.weight\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.bias\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.6.output.dense.weight\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.bias\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.output.dense.weight\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.bias\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.key.weight\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.bias\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.query.weight\n", "2020-12-20 09:09:37,958 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.bias\n", "2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.attention.self.value.weight\n", "2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.bias\n", "2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.intermediate.dense.weight\n", "2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.LayerNorm.bias\n", "2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers 
- bert_model.encoder.layer.7.output.LayerNorm.weight\n", "2020-12-20 09:09:37,968 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.bias\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.7.output.dense.weight\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.bias\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.output.dense.weight\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.bias\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.key.weight\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.bias\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.query.weight\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.bias\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.attention.self.value.weight\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.bias\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.intermediate.dense.weight\n", "2020-12-20 09:09:37,969 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.LayerNorm.weight\n", "2020-12-20 09:09:37,970 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.8.output.dense.weight\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.LayerNorm.weight\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.output.dense.weight\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.key.weight\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.query.weight\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.attention.self.value.weight\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.bias\n", "2020-12-20 09:09:37,970 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.intermediate.dense.weight\n", "2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.bias\n", "2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.LayerNorm.weight\n", "2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.bias\n", "2020-12-20 09:09:37,971 - INFO - 
allennlp.nn.initializers - bert_model.encoder.layer.9.output.dense.weight\n", "2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.bias\n", "2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers - bert_model.pooler.dense.weight\n", "2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers - tag_projection_layer.bias\n", "2020-12-20 09:09:37,971 - INFO - allennlp.nn.initializers - tag_projection_layer.weight\n", "2020-12-20 09:09:38,432 - INFO - allennlp.common.params - dataset_reader.type = srl\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.lazy = False\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.cache_directory = None\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.max_instances = None\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.manual_distributed_sharding = False\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.manual_multi_process_sharding = False\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.token_indexers = None\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.domain_identifier = None\n", "2020-12-20 09:09:38,433 - INFO - allennlp.common.params - dataset_reader.bert_model_name = bert-base-uncased\n", "2020-12-20 09:09:38,744 - INFO - transformers.tokenization_utils - loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt from cache at /root/.cache/torch/transformers/26bc1ad6c0ac742e9b52263248f6d0f00068293b33709fae12320c0e35ccfbbb.542ce4285a40d23a559526243235df47c5f75c197f04f37d1a0c124c32c9a084\n", "input 0: {\"sentence\": \"Now, ice pucks guys!\"}\n", "prediction: {\"verbs\": [], \"words\": [\"Now\", \",\", \"ice\", \"pucks\", \"guys\", \"!\"]}\n", "\n", "2020-12-20 09:09:39,466 - INFO - allennlp.models.archival - removing temporary unarchived model dir at 
/tmp/tmpewur_o27\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter10/Haystack_QA_Pipeline.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" }, "colab": { "name": "Haystack_QA_Pipeline.ipynb", "provenance": [], "collapsed_sections": [] }, "accelerator": "GPU", "widgets": { "application/vnd.jupyter.widget-state+json": { "8e2aa2531c9a4890ad722171e4a51122": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_743ca8b7bb574fd99e3e0516ab60b7cc", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_4b361abfd25e4dfc86b96384753f6cdd", "IPY_MODEL_f0da7f5b445a443ca0692396bdf54062" ] } }, "743ca8b7bb574fd99e3e0516ab60b7cc": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, 
"grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "4b361abfd25e4dfc86b96384753f6cdd": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_f480c548d8774a8abc61bd8b045fee29", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 571, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 571, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_8459fc383ca741a29ea1d53190b71457" } }, "f0da7f5b445a443ca0692396bdf54062": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_d1b1591de71a4bd8a2909975ee82d998", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 571/571 [00:16<00:00, 34.8B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_afebd0dd7f3e4e44927220ad3fc13f17" } }, "f480c548d8774a8abc61bd8b045fee29": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": 
null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "8459fc383ca741a29ea1d53190b71457": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d1b1591de71a4bd8a2909975ee82d998": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "afebd0dd7f3e4e44927220ad3fc13f17": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": 
null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "56c4b0d2d2654b1e9470d8f0b920ae16": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_d0cdfe65d369405a90a7abcf38c289c8", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_d66b73d1014848db81edf429c857f9ef", "IPY_MODEL_dce3f0f8e8e24c73913617300da7e370" ] } }, "d0cdfe65d369405a90a7abcf38c289c8": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, 
"max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d66b73d1014848db81edf429c857f9ef": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_2c095740976c4bdea8645e21d277283c", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 496313727, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 496313727, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_fbc7634200954b37b7c869050591022d" } }, "dce3f0f8e8e24c73913617300da7e370": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_4a622f003ef24eb79fd13a08f423c665", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 496M/496M [00:13<00:00, 35.8MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_db417b1034bc405fa3ac881361f943c4" } }, "2c095740976c4bdea8645e21d277283c": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, 
"_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "fbc7634200954b37b7c869050591022d": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "4a622f003ef24eb79fd13a08f423c665": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "db417b1034bc405fa3ac881361f943c4": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, 
"border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "ede7dfd59ae8455689373afda2771132": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_44ff19ca5bed4708aa3bf39032563b2e", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_9be9ce8699c14d47853d40e9cd8bf7d0", "IPY_MODEL_9f3d806fbec84b179e9a49e49c905fa3" ] } }, "44ff19ca5bed4708aa3bf39032563b2e": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, 
"max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "9be9ce8699c14d47853d40e9cd8bf7d0": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_f532cd4dccc14741a9d4e506a72507ad", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 898822, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 898822, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_d810a6e15c1d43e0a6793c60622da504" } }, "9f3d806fbec84b179e9a49e49c905fa3": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_4f711e53e0944446aa51e8a241017e8c", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 899k/899k [00:00<00:00, 933kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_19b30696c5294a6b92cecb3533b10fed" } }, "f532cd4dccc14741a9d4e506a72507ad": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, 
"_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "d810a6e15c1d43e0a6793c60622da504": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "4f711e53e0944446aa51e8a241017e8c": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "19b30696c5294a6b92cecb3533b10fed": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, 
"border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "7fb2356ed11344af950f52ebad7c57e1": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_74e56a811e4646839645cc8e1cd2945e", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_9207a922be2c48b187d97e1575b3ac39", "IPY_MODEL_7726ea4f836642ad8ab3e19a4f087919" ] } }, "74e56a811e4646839645cc8e1cd2945e": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, 
"max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "9207a922be2c48b187d97e1575b3ac39": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_c11075b708e84ffabe5ce867f710dbb3", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 456318, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 456318, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_3596178648404646a454b7fb5aeb0742" } }, "7726ea4f836642ad8ab3e19a4f087919": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_2b348979747342869ab83a3bb7d9f71a", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 456k/456k [00:02<00:00, 215kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_9b7811f75b0e42e4b7a4fa14e089668c" } }, "c11075b708e84ffabe5ce867f710dbb3": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, 
"_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "3596178648404646a454b7fb5aeb0742": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "2b348979747342869ab83a3bb7d9f71a": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "9b7811f75b0e42e4b7a4fa14e089668c": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, 
"border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "9ffa302c8b604d56a4d9826fb783f786": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_a37b422df0054a89a9b59d4233461b1b", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_3f9fa7a617c74f4fa4eda55e6d2c3f3d", "IPY_MODEL_18e00a4457534250970b69f8146282e2" ] } }, "a37b422df0054a89a9b59d4233461b1b": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, 
"max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "3f9fa7a617c74f4fa4eda55e6d2c3f3d": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_a23f46136c0e4373a9733cc7dba0c95e", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 772, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 772, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_26b1e8ea52e041ce8a1847e4d9483434" } }, "18e00a4457534250970b69f8146282e2": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_db5bdb301f2b48c5987c8fcb5236cf6c", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 772/772 [00:00<00:00, 774B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_6064912c6a0e41d4b0a8389f3e56b1e9" } }, "a23f46136c0e4373a9733cc7dba0c95e": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, 
"_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "26b1e8ea52e041ce8a1847e4d9483434": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "db5bdb301f2b48c5987c8fcb5236cf6c": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "6064912c6a0e41d4b0a8389f3e56b1e9": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, 
"border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d5287e28a98749b5b2cd1560f157ff36": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_17ffd297995d442d86a05273b39ba7c0", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_38d8b60615fa4e649634802153ad09cd", "IPY_MODEL_c86d5ff0dd8b40a5a4ff92d16ffcb21a" ] } }, "17ffd297995d442d86a05273b39ba7c0": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, 
"max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "38d8b60615fa4e649634802153ad09cd": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_5de028a8c6494e5d9e2c0e1134080369", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 79, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 79, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_bb183a36134f4463a5453a043de77d93" } }, "c86d5ff0dd8b40a5a4ff92d16ffcb21a": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_fbedf7b06f9846348e174fc2536b765f", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 79.0/79.0 [00:00<00:00, 216B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_cac99d9ef6f74c07937fc39a20157937" } }, "5de028a8c6494e5d9e2c0e1134080369": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, 
"_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "bb183a36134f4463a5453a043de77d93": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "fbedf7b06f9846348e174fc2536b765f": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "cac99d9ef6f74c07937fc39a20157937": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, 
"border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } } } } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "NyzjawOojIw-" }, "source": [ "#Haystack Question-Answering Framework\r\n", "\r\n", "Notebook Author: [Malte Pietsch](https://www.linkedin.com/in/maltepietsch/)\r\n", "\r\n", "[Deepset AI Haystack GitHub Repository](https://github.com/deepset-ai/haystack/)\r\n" ] }, { "cell_type": "code", "metadata": { "id": "9E7CI3wONcSo", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "56fe1747-3bb5-46e0-b8f1-c55641473dcd" }, "source": [ "# Install Haystack\n", "!pip install farm-haystack==0.6.0\n", "\n", "# Install specific versions of urllib and torch to avoid conflicts with preinstalled versions on Colab\n", "!pip install urllib3==1.25.4\n", "!pip install torch==1.6.0+cu101 -f https://download.pytorch.org/whl/torch_stable.html\n" ], "execution_count": 1, "outputs": [ { "output_type": "stream", "text": [ "Collecting farm-haystack==0.6.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/6d/c1/004081bfe50c20433718812321044b9d9dc7cf73bc5a63a2b335227bd21c/farm_haystack-0.6.0-py3-none-any.whl (104kB)\n", "\u001b[K |████████████████████████████████| 112kB 8.1MB/s \n", "\u001b[?25hCollecting uvloop; sys_platform != \"win32\" and sys_platform != \"cygwin\"\n", "\u001b[?25l 
Downloading https://files.pythonhosted.org/packages/41/48/586225bbb02d3bdca475b17e4be5ce5b3f09da2d6979f359916c1592a687/uvloop-0.14.0-cp36-cp36m-manylinux2010_x86_64.whl (3.9MB)\n", "\u001b[K |████████████████████████████████| 3.9MB 13.7MB/s \n", "\u001b[?25hRequirement already satisfied: coverage in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (3.7.1)\n", "Requirement already satisfied: pandas in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (1.1.5)\n", "Requirement already satisfied: nltk in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (3.2.5)\n", "Collecting elasticsearch<=7.10,>=7.7\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/14/ba/f950bdd9164fb2bbbe5093700162234fbe61f446fe2300a8993761c132ca/elasticsearch-7.10.0-py2.py3-none-any.whl (321kB)\n", "\u001b[K |████████████████████████████████| 327kB 49.8MB/s \n", "\u001b[?25hCollecting farm==0.5.0\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/a3/e4/2f47c850732a1d729e74add867e967f058370f29a313da05dc871ff8465e/farm-0.5.0-py3-none-any.whl (207kB)\n", "\u001b[K |████████████████████████████████| 215kB 56.3MB/s \n", "\u001b[?25hCollecting fastapi\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/9f/33/1b643f650688ad368983bbaf3b0658438038ea84d775dd37393d826c3833/fastapi-0.63.0-py3-none-any.whl (50kB)\n", "\u001b[K |████████████████████████████████| 51kB 8.0MB/s \n", "\u001b[?25hCollecting python-multipart\n", " Downloading https://files.pythonhosted.org/packages/46/40/a933ac570bf7aad12a298fc53458115cc74053474a72fbb8201d7dc06d3d/python-multipart-0.0.5.tar.gz\n", "Collecting langdetect\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/56/a3/8407c1e62d5980188b4acc45ef3d94b933d14a2ebc9ef3505f22cf772570/langdetect-1.0.8.tar.gz (981kB)\n", "\u001b[K |████████████████████████████████| 983kB 53.3MB/s \n", "\u001b[?25hCollecting psycopg2-binary; sys_platform != \"win32\" and 
sys_platform != \"cygwin\"\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/f2/1b/720b36697158113ca1b2221a8e96a470088ccf3770d182214689d1a96a07/psycopg2_binary-2.8.6-cp36-cp36m-manylinux1_x86_64.whl (3.0MB)\n", "\u001b[K |████████████████████████████████| 3.0MB 53.4MB/s \n", "\u001b[?25hRequirement already satisfied: networkx in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (2.5)\n", "Collecting faiss-cpu==1.6.3; sys_platform != \"win32\" and sys_platform != \"cygwin\"\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/1d/84/9de38703486d9f00b1a63590887a318d08c52f10f768968bd7626aee75da/faiss_cpu-1.6.3-cp36-cp36m-manylinux2010_x86_64.whl (7.2MB)\n", "\u001b[K |████████████████████████████████| 7.2MB 28.6MB/s \n", "\u001b[?25hCollecting tika\n", " Downloading https://files.pythonhosted.org/packages/96/07/244fbb9c74c0de8a3745cc9f3f496077a29f6418c7cbd90d68fd799574cb/tika-1.24.tar.gz\n", "Requirement already satisfied: sklearn in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (0.0)\n", "Collecting uvicorn\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/2e/02/1e2520f6999e793d5bc5c15d8057b2e829d16a148e41199e0ae519653fa0/uvicorn-0.13.3-py3-none-any.whl (45kB)\n", "\u001b[K |████████████████████████████████| 51kB 9.6MB/s \n", "\u001b[?25hCollecting gunicorn\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/69/ca/926f7cd3a2014b16870086b2d0fdc84a9e49473c68a8dff8b57f7c156f43/gunicorn-20.0.4-py2.py3-none-any.whl (77kB)\n", "\u001b[K |████████████████████████████████| 81kB 12.3MB/s \n", "\u001b[?25hCollecting sqlalchemy-utils\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/14/68/e5301c4c960c79a32333b8805e52cb69d3d237aa869a773b4157ccb3eb26/SQLAlchemy-Utils-0.36.8.tar.gz (138kB)\n", "\u001b[K |████████████████████████████████| 143kB 54.8MB/s \n", "\u001b[?25hCollecting httptools\n", "\u001b[?25l Downloading 
https://files.pythonhosted.org/packages/b1/a6/dc1e7e8f4049ab70d52c9690ec10652e268ab2542853033cc1d539594102/httptools-0.1.1-cp36-cp36m-manylinux1_x86_64.whl (216kB)\n", "\u001b[K |████████████████████████████████| 225kB 48.2MB/s \n", "\u001b[?25hRequirement already satisfied: more-itertools in /usr/local/lib/python3.6/dist-packages (from farm-haystack==0.6.0) (8.6.0)\n", "Collecting tox\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/e0/79/5915b9dad867e89bb6495456acfe5d4e2287e74dfa29c059f7b127d5480e/tox-3.20.1-py2.py3-none-any.whl (83kB)\n", "\u001b[K |████████████████████████████████| 92kB 13.6MB/s \n", "\u001b[?25hCollecting python-docx\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/e4/83/c66a1934ed5ed8ab1dbb9931f1779079f8bca0f6bbc5793c06c4b5e7d671/python-docx-0.8.10.tar.gz (5.5MB)\n", "\u001b[K |████████████████████████████████| 5.5MB 41.0MB/s \n", "\u001b[?25hCollecting elastic-apm\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/27/c4/7bc90b3398198ea87f4316b739055f0319a0871415e561aceb4682e30a73/elastic_apm-5.10.0-cp36-cp36m-manylinux2010_x86_64.whl (318kB)\n", "\u001b[K |████████████████████████████████| 327kB 57.1MB/s \n", "\u001b[?25hRequirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas->farm-haystack==0.6.0) (2018.9)\n", "Requirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.6/dist-packages (from pandas->farm-haystack==0.6.0) (2.8.1)\n", "Requirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.6/dist-packages (from pandas->farm-haystack==0.6.0) (1.19.4)\n", "Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from nltk->farm-haystack==0.6.0) (1.15.0)\n", "Requirement already satisfied: certifi in /usr/local/lib/python3.6/dist-packages (from elasticsearch<=7.10,>=7.7->farm-haystack==0.6.0) (2020.12.5)\n", "Requirement already satisfied: urllib3<2,>=1.21.1 in 
/usr/local/lib/python3.6/dist-packages (from elasticsearch<=7.10,>=7.7->farm-haystack==0.6.0) (1.24.3)\n", "Collecting flask-cors\n", " Downloading https://files.pythonhosted.org/packages/69/7f/d0aeaaafb5c3c76c8d2141dbe2d4f6dca5d6c31872d4e5349768c1958abc/Flask_Cors-3.0.9-py2.py3-none-any.whl\n", "Collecting dotmap==1.3.0\n", " Downloading https://files.pythonhosted.org/packages/fa/eb/ee5f0358a9e0ede90308d8f34e697e122f191c2702dc4f614eca7770b1eb/dotmap-1.3.0-py3-none-any.whl\n", "Requirement already satisfied: setuptools in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (51.0.0)\n", "Collecting transformers==3.3.1\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/19/22/aff234f4a841f8999e68a7a94bdd4b60b4cebcfeca5d67d61cd08c9179de/transformers-3.3.1-py3-none-any.whl (1.1MB)\n", "\u001b[K |████████████████████████████████| 1.1MB 50.3MB/s \n", "\u001b[?25hCollecting torch<1.7,>1.5\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/38/53/914885a93a44b96c0dd1c36f36ff10afe341f091230aad68f7228d61db1e/torch-1.6.0-cp36-cp36m-manylinux1_x86_64.whl (748.8MB)\n", "\u001b[K |████████████████████████████████| 748.8MB 24kB/s \n", "\u001b[?25hCollecting boto3\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/1d/da/c6eaf4c1c8eec70fea402495ee34112824241bc96e20756c0c0c6f97feab/boto3-1.16.46-py2.py3-none-any.whl (130kB)\n", "\u001b[K |████████████████████████████████| 133kB 59.5MB/s \n", "\u001b[?25hRequirement already satisfied: scipy>=1.3.2 in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (1.4.1)\n", "Requirement already satisfied: dill in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (0.3.3)\n", "Requirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from farm==0.5.0->farm-haystack==0.6.0) (4.41.1)\n", "Collecting seqeval==0.0.12\n", " Downloading 
https://files.pythonhosted.org/packages/34/91/068aca8d60ce56dd9ba4506850e876aba5e66a6f2f29aa223224b50df0de/seqeval-0.0.12.tar.gz\n", "[... pip download and build log truncated ...]\n", "Successfully built python-multipart langdetect tika sqlalchemy-utils python-docx seqeval sacremoses databricks-cli\n", "\u001b[31mERROR: torchvision 0.8.1+cu101 has requirement torch==1.7.0, but you'll have torch 1.6.0 which is incompatible.\u001b[0m\n", "\u001b[31mERROR: pytest 3.6.4 has requirement pluggy<0.8,>=0.5, but you'll have pluggy 0.13.1 which is incompatible.\u001b[0m\n", "\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\u001b[0m\n", "\u001b[31mERROR: botocore 1.19.46 has requirement urllib3<1.27,>=1.25.4; python_version != \"3.4\", but you'll have urllib3 1.24.3 which is incompatible.\u001b[0m\n", "Successfully installed Mako-1.1.3 Werkzeug-0.16.1 alembic-1.4.3 aniso8601-8.1.0 appdirs-1.4.4 boto3-1.16.46 botocore-1.19.46 databricks-cli-0.14.1 distlib-0.3.1 docker-4.4.1 dotmap-1.3.0 elastic-apm-5.10.0 elasticsearch-7.10.0 faiss-cpu-1.6.3 farm-0.5.0 farm-haystack-0.6.0 fastapi-0.63.0 flask-cors-3.0.9 flask-restplus-0.13.0 gitdb-4.0.5 gitpython-3.1.11 gunicorn-20.0.4 h11-0.11.0 httptools-0.1.1 importlib-metadata-2.1.1 jmespath-0.10.0 langdetect-1.0.8 mlflow-1.0.0 pluggy-0.13.1 psycopg2-binary-2.8.6 pydantic-1.7.3 python-docx-0.8.10 python-editor-1.0.4 python-multipart-0.0.5 querystring-parser-1.2.4 s3transfer-0.3.3 sacremoses-0.0.43 sentencepiece-0.1.94 seqeval-0.0.12 simplejson-3.17.2 smmap-3.0.4 sqlalchemy-utils-0.36.8 starlette-0.13.6 tika-1.24 tokenizers-0.8.1rc2 torch-1.6.0 tox-3.20.1 transformers-3.3.1 uvicorn-0.13.3 uvloop-0.14.0 virtualenv-20.2.2 websocket-client-0.57.0\n", "Collecting urllib3==1.25.4\n", "\u001b[?25l Downloading https://files.pythonhosted.org/packages/91/0d/7777358f672a14b7ae0dfcd29f949f409f913e0578190d6bfa68eb55864b/urllib3-1.25.4-py2.py3-none-any.whl (125kB)\n", 
"\u001b[K |████████████████████████████████| 133kB 7.5MB/s \n", "\u001b[31mERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.\u001b[0m\n", "\u001b[?25hInstalling collected packages: urllib3\n", " Found existing installation: urllib3 1.24.3\n", " Uninstalling urllib3-1.24.3:\n", " Successfully uninstalled urllib3-1.24.3\n", "Successfully installed urllib3-1.25.4\n", "Looking in links: https://download.pytorch.org/whl/torch_stable.html\n", "Collecting torch==1.6.0+cu101\n", "\u001b[?25l Downloading https://download.pytorch.org/whl/cu101/torch-1.6.0%2Bcu101-cp36-cp36m-linux_x86_64.whl (708.0MB)\n", "\u001b[K |████████████████████████████████| 708.0MB 10kB/s \n", "\u001b[?25hRequirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch==1.6.0+cu101) (0.16.0)\n", "Requirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from torch==1.6.0+cu101) (1.19.4)\n", "\u001b[31mERROR: torchvision 0.8.1+cu101 has requirement torch==1.7.0, but you'll have torch 1.6.0+cu101 which is incompatible.\u001b[0m\n", "Installing collected packages: torch\n", " Found existing installation: torch 1.6.0\n", " Uninstalling torch-1.6.0:\n", " Successfully uninstalled torch-1.6.0\n", "Successfully installed torch-1.6.0+cu101\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "R8aBneh68pIJ" }, "source": [ "# Extractive QA in a closed domain (single text)" ] }, { "cell_type": "code", "metadata": { "pycharm": { "is_executing": false }, "id": "m7G3G4BjNcSz", "colab": { "base_uri": "https://localhost:8080/", "height": 790, "referenced_widgets": [ "8e2aa2531c9a4890ad722171e4a51122", "743ca8b7bb574fd99e3e0516ab60b7cc", "4b361abfd25e4dfc86b96384753f6cdd", "f0da7f5b445a443ca0692396bdf54062", "f480c548d8774a8abc61bd8b045fee29", "8459fc383ca741a29ea1d53190b71457", "d1b1591de71a4bd8a2909975ee82d998", "afebd0dd7f3e4e44927220ad3fc13f17", "56c4b0d2d2654b1e9470d8f0b920ae16", 
"d0cdfe65d369405a90a7abcf38c289c8", "d66b73d1014848db81edf429c857f9ef", "dce3f0f8e8e24c73913617300da7e370", "2c095740976c4bdea8645e21d277283c", "fbc7634200954b37b7c869050591022d", "4a622f003ef24eb79fd13a08f423c665", "db417b1034bc405fa3ac881361f943c4", "ede7dfd59ae8455689373afda2771132", "44ff19ca5bed4708aa3bf39032563b2e", "9be9ce8699c14d47853d40e9cd8bf7d0", "9f3d806fbec84b179e9a49e49c905fa3", "f532cd4dccc14741a9d4e506a72507ad", "d810a6e15c1d43e0a6793c60622da504", "4f711e53e0944446aa51e8a241017e8c", "19b30696c5294a6b92cecb3533b10fed", "7fb2356ed11344af950f52ebad7c57e1", "74e56a811e4646839645cc8e1cd2945e", "9207a922be2c48b187d97e1575b3ac39", "7726ea4f836642ad8ab3e19a4f087919", "c11075b708e84ffabe5ce867f710dbb3", "3596178648404646a454b7fb5aeb0742", "2b348979747342869ab83a3bb7d9f71a", "9b7811f75b0e42e4b7a4fa14e089668c", "9ffa302c8b604d56a4d9826fb783f786", "a37b422df0054a89a9b59d4233461b1b", "3f9fa7a617c74f4fa4eda55e6d2c3f3d", "18e00a4457534250970b69f8146282e2", "a23f46136c0e4373a9733cc7dba0c95e", "26b1e8ea52e041ce8a1847e4d9483434", "db5bdb301f2b48c5987c8fcb5236cf6c", "6064912c6a0e41d4b0a8389f3e56b1e9", "d5287e28a98749b5b2cd1560f157ff36", "17ffd297995d442d86a05273b39ba7c0", "38d8b60615fa4e649634802153ad09cd", "c86d5ff0dd8b40a5a4ff92d16ffcb21a", "5de028a8c6494e5d9e2c0e1134080369", "bb183a36134f4463a5453a043de77d93", "fbedf7b06f9846348e174fc2536b765f", "cac99d9ef6f74c07937fc39a20157937" ] }, "outputId": "2490aed8-2175-4c60-f39e-3711f21d6f1a" }, "source": [ "# Load a local model or any of the QA models on Hugging Face's model hub (https://huggingface.co/models)\n", "from haystack.reader.farm import FARMReader\n", "\n", "reader = FARMReader(model_name_or_path=\"deepset/roberta-base-squad2\", use_gpu=True, no_ans_boost=0, return_no_answer=False)\n", "\n", "\n", "# Create document which the model should scan for answers.\n", "from haystack import Document\n", "\n", "text = \"The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of 
the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.\"\n", "doc = Document(text=text)" ], "execution_count": 2, "outputs": [ { "output_type": "stream", "text": [ "12/31/2020 16:02:14 - INFO - faiss - Loading faiss with AVX2 support.\n", "12/31/2020 16:02:14 - INFO - faiss - Loading faiss.\n", "12/31/2020 16:02:15 - INFO - farm.utils - device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\n", "12/31/2020 16:02:15 - INFO - farm.infer - Could not find `deepset/roberta-base-squad2` locally. Try to download from model hub ...\n", "12/31/2020 16:02:15 - INFO - filelock - Lock 139851960964880 acquired on /root/.cache/torch/transformers/f7d4b9379a9c487fa03ccf3d8e00058faa9d664cf01fc03409138246f48760da.6060f348ba2b58d6d30b5324910152ffc512e7c3891ed13f22844f1a9b5c0d0f.lock\n" ], "name": "stderr" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "8e2aa2531c9a4890ad722171e4a51122", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=571.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "12/31/2020 16:02:16 - INFO - filelock - Lock 139851960964880 released on /root/.cache/torch/transformers/f7d4b9379a9c487fa03ccf3d8e00058faa9d664cf01fc03409138246f48760da.6060f348ba2b58d6d30b5324910152ffc512e7c3891ed13f22844f1a9b5c0d0f.lock\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "12/31/2020 16:02:16 - INFO - filelock - Lock 139849313884144 acquired on 
/root/.cache/torch/transformers/8c0c8b6371111ac5fbc176aefcf9dbe129db7be654c569b8375dd3712fc4dc67.a851909c96149f062acca04d647da88d0dcd3a52cd5a8c7169e89fc6e5971c7b.lock\n" ], "name": "stderr" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "56c4b0d2d2654b1e9470d8f0b920ae16", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=496313727.0, style=ProgressStyle(descri…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "12/31/2020 16:02:30 - INFO - filelock - Lock 139849313884144 released on /root/.cache/torch/transformers/8c0c8b6371111ac5fbc176aefcf9dbe129db7be654c569b8375dd3712fc4dc67.a851909c96149f062acca04d647da88d0dcd3a52cd5a8c7169e89fc6e5971c7b.lock\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "Some weights of RobertaModel were not initialized from the model checkpoint at deepset/roberta-base-squad2 and are newly initialized: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']\n", "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", "12/31/2020 16:02:32 - WARNING - farm.modeling.language_model - Could not automatically detect from language model name what language it is. \n", "\t We guess it's an *ENGLISH* model ... 
\n", "\t If not: Init the language model by supplying the 'language' param.\n", "12/31/2020 16:02:44 - INFO - filelock - Lock 139849313883528 acquired on /root/.cache/torch/transformers/1e3af82648d7190d959a9d76d727ef629b1ca51b3da6ad04039122453cb56307.6a4061e8fc00057d21d80413635a86fdcf55b6e7594ad9e25257d2f99a02f4be.lock\n" ], "name": "stderr" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ede7dfd59ae8455689373afda2771132", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=898822.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "12/31/2020 16:02:45 - INFO - filelock - Lock 139849313883528 released on /root/.cache/torch/transformers/1e3af82648d7190d959a9d76d727ef629b1ca51b3da6ad04039122453cb56307.6a4061e8fc00057d21d80413635a86fdcf55b6e7594ad9e25257d2f99a02f4be.lock\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "12/31/2020 16:02:45 - INFO - filelock - Lock 139849296850280 acquired on /root/.cache/torch/transformers/b901c69e8e7da4a24c635ad81d016d274f174261f4f5c144e43f4b00e242c3b0.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda.lock\n" ], "name": "stderr" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "7fb2356ed11344af950f52ebad7c57e1", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=456318.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "12/31/2020 16:02:46 - INFO - filelock - Lock 139849296850280 released on /root/.cache/torch/transformers/b901c69e8e7da4a24c635ad81d016d274f174261f4f5c144e43f4b00e242c3b0.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda.lock\n" ], "name": 
"stderr" }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "12/31/2020 16:02:46 - INFO - filelock - Lock 139849313883528 acquired on /root/.cache/torch/transformers/2d9b03b59a8af464bf4238025a3cf0e5a340b9d0ba77400011e23c130b452510.6e217123a3ada61145de1f20b1443a1ec9aac93492a4bd1ce6a695935f0fd97a.lock\n" ], "name": "stderr" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "9ffa302c8b604d56a4d9826fb783f786", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=772.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "12/31/2020 16:02:47 - INFO - filelock - Lock 139849313883528 released on /root/.cache/torch/transformers/2d9b03b59a8af464bf4238025a3cf0e5a340b9d0ba77400011e23c130b452510.6e217123a3ada61145de1f20b1443a1ec9aac93492a4bd1ce6a695935f0fd97a.lock\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "12/31/2020 16:02:47 - INFO - filelock - Lock 139849313883528 acquired on /root/.cache/torch/transformers/507984f2e28c7dfed5db9a20acd68beb969c7f2833abc9e582e967fa0291f3dc.ec06af3e1b426682955dab3bd553eaf178b6eafac9079fc133925e0e2654213e.lock\n" ], "name": "stderr" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "d5287e28a98749b5b2cd1560f157ff36", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=79.0, style=ProgressStyle(description_w…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "12/31/2020 16:02:47 - INFO - filelock - Lock 139849313883528 released on 
/root/.cache/torch/transformers/507984f2e28c7dfed5db9a20acd68beb969c7f2833abc9e582e967fa0291f3dc.ec06af3e1b426682955dab3bd553eaf178b6eafac9079fc133925e0e2654213e.lock\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "12/31/2020 16:02:48 - INFO - farm.utils - device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\n", "12/31/2020 16:02:48 - INFO - farm.infer - Got ya 1 parallel workers to do inference ...\n", "12/31/2020 16:02:48 - INFO - farm.infer - 0 \n", "12/31/2020 16:02:48 - INFO - farm.infer - /w\\\n", "12/31/2020 16:02:48 - INFO - farm.infer - /'\\\n", "12/31/2020 16:02:48 - INFO - farm.infer - \n" ], "name": "stderr" } ] }, { "cell_type": "code", "metadata": { "id": "om3NX21XPiu1", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "f7dec3f6-68c2-49a2-d4ab-1a5e7448d050" }, "source": [ "# Some questions that \"work\":\n", "reader.predict(query=\"Where is Pioneer Boulevard located?\", documents=[doc])" ], "execution_count": 3, "outputs": [ { "output_type": "stream", "text": [ "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 22.86 Batches/s]\n" ], "name": "stderr" }, { "output_type": "execute_result", "data": { "text/plain": [ "{'answers': [{'answer': 'Los Angeles',\n", " 'context': 'The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool ja',\n", " 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\n", " 'offset_end': 66,\n", " 'offset_end_in_doc': 66,\n", " 'offset_start': 55,\n", " 'offset_start_in_doc': 55,\n", " 'probability': 0.8022719840448774,\n", " 'score': 11.204442024230957}],\n", " 'no_ans_gap': 10.05622935295105,\n", " 'query': 'Where is Pioneer Boulevard located?'}" ] }, "metadata": { "tags": [] }, "execution_count": 3 } ] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "vKtgTCpRCtYd", "outputId": "7256a926-11f3-4f11-947b-68c406682c62" }, "source": [ "reader.predict(query=\"Who drove to Las Vegas?\", documents=[doc])\n" ], "execution_count": 4, "outputs": [ { "output_type": "stream", "text": [ "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 34.52 Batches/s]\n" ], "name": "stderr" }, { "output_type": "execute_result", "data": { "text/plain": [ "{'answers': [{'answer': 'Jo and Maria',\n", " 'context': 't of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. 
They plann',\n", " 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\n", " 'offset_end': 81,\n", " 'offset_end_in_doc': 305,\n", " 'offset_start': 69,\n", " 'offset_start_in_doc': 293,\n", " 'probability': 0.8081116565023317,\n", " 'score': 11.50229263305664}],\n", " 'no_ans_gap': 3.7832298278808594,\n", " 'query': 'Who drove to Las Vegas?'}" ] }, "metadata": { "tags": [] }, "execution_count": 4 } ] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "6FErhiqfC2kz", "outputId": "d7726229-fb30-4e59-843d-5fa991c80071" }, "source": [ "reader.predict(query=\"Who is singing?\", documents=[doc])\n" ], "execution_count": 5, "outputs": [ { "output_type": "stream", "text": [ "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 34.25 Batches/s]\n" ], "name": "stderr" }, { "output_type": "execute_result", "data": { "text/plain": [ "{'answers': [{'answer': 'Nat King Cole',\n", " 'context': 'r pleasant to be making it out of the city on this Friday afternoon. 
Nat King Cole was singing as Jo and Maria slowly made their way out of LA and dro',\n", " 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\n", " 'offset_end': 82,\n", " 'offset_end_in_doc': 277,\n", " 'offset_start': 69,\n", " 'offset_start_in_doc': 264,\n", " 'probability': 0.8818636635368704,\n", " 'score': 16.081584930419922}],\n", " 'no_ans_gap': 12.141630411148071,\n", " 'query': 'Who is singing?'}" ] }, "metadata": { "tags": [] }, "execution_count": 5 } ] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "bUasgTO7DFN2", "outputId": "f533b27f-7d6c-4160-e93e-deaf30e50b11" }, "source": [ "reader.predict(query=\"What is the plan for the night?\", documents=[doc])\n" ], "execution_count": 6, "outputs": [ { "output_type": "stream", "text": [ "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.32 Batches/s]\n" ], "name": "stderr" }, { "output_type": "execute_result", "data": { "text/plain": [ "{'answers': [{'answer': 'They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show',\n", " 'context': 'de their way out of LA and drove toward Barstow. 
They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.',\n", " 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403',\n", " 'offset_end': 149,\n", " 'offset_end_in_doc': 464,\n", " 'offset_start': 49,\n", " 'offset_start_in_doc': 364,\n", " 'probability': 0.7315710454025786,\n", " 'score': 8.020864486694336}],\n", " 'no_ans_gap': 6.077347040176392,\n", " 'query': 'What is the plan for the night?'}" ] }, "metadata": { "tags": [] }, "execution_count": 6 } ] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "EAbgAUqe9wmN", "outputId": "e88a40ec-2e79-45a9-b934-1be2bf25e801" }, "source": [ "# Some questions where the answer is not in the text (and the model therefore cannot find it)\n", "# If you inspect the results, you will see that the value \"no_ans_gap\" is negative for all these questions and actually indicates that the likelihood of \"no answer\" is higher than the best textual answer\n", "questions = [\"Where is Los Angeles located?\",\"Where is LA located?\",\"Where is Barstow located?\",\"Where is Las Vegas located ?\"]\n", "for q in questions:\n", " result = reader.predict(query=q, documents=[doc])\n", " print(result)\n", " print(\"\\n\")" ], "execution_count": 7, "outputs": [ { "output_type": "stream", "text": [ "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.07 Batches/s]\n", "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 31.49 Batches/s]\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "{'query': 'Where is Los Angeles located?', 'no_ans_gap': -0.41483497619628906, 'answers': [{'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. 
Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.0702476501464844, 'probability': 0.5333954464343146, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n", "{'query': 'Where is LA located?', 'no_ans_gap': -0.19409167766571045, 'answers': [{'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.6217641830444336, 'probability': 0.5505072801165964, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. 
Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 30.91 Batches/s]\n", "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.63 Batches/s]" ], "name": "stderr" }, { "output_type": "stream", "text": [ "{'query': 'Where is Barstow located?', 'no_ans_gap': -1.593643844127655, 'answers': [{'answer': 'Las Vegas', 'score': 0.7261489033699036, 'probability': 0.522676586113031, 'context': 'de their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.', 'offset_start': 72, 'offset_end': 81, 'offset_start_in_doc': 387, 'offset_end_in_doc': 396, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n", "{'query': 'Where is Las Vegas located ?', 'no_ans_gap': -2.1370767652988434, 'answers': [{'answer': 'Los Angeles', 'score': -0.025329262018203735, 'probability': 0.49920846122316637, 'context': 'The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool ja', 'offset_start': 55, 'offset_end': 66, 'offset_start_in_doc': 55, 'offset_end_in_doc': 66, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "\n" ], "name": "stderr" } ] }, { "cell_type": "code", "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "F174S7VV-xRZ", "outputId": "45eacf23-6e02-43af-d11a-7c3e2cf9400b" }, "source": [ "# We can also directly make use of this \"no answer\" option and allow our reader to return \"no answer\" (indicated via \"answer: None\" in the results) by setting return_no_answer=True in the FARMReader:\n", "reader = FARMReader(model_name_or_path=\"deepset/roberta-base-squad2\", use_gpu=True, no_ans_boost=0, return_no_answer=True)\n", "for q in questions:\n", " result = reader.predict(query=q, documents=[doc])\n", " print(result)\n", " print(\"\\n\")" ], "execution_count": 8, "outputs": [ { "output_type": "stream", "text": [ "12/31/2020 16:02:49 - INFO - farm.utils - device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\n", "12/31/2020 16:02:49 - INFO - farm.infer - Could not find `deepset/roberta-base-squad2` locally. Try to download from model hub ...\n", "Some weights of RobertaModel were not initialized from the model checkpoint at deepset/roberta-base-squad2 and are newly initialized: ['roberta.pooler.dense.weight', 'roberta.pooler.dense.bias']\n", "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n", "12/31/2020 16:02:53 - WARNING - farm.modeling.language_model - Could not automatically detect from language model name what language it is. 
\n", "\t If not: Init the language model by supplying the 'language' param.\n", "12/31/2020 16:03:00 - INFO - farm.utils - device: cuda n_gpu: 1, distributed training: False, automatic mixed precision training: None\n", "12/31/2020 16:03:00 - INFO - farm.infer - Got ya 1 parallel workers to do inference ...\n", "12/31/2020 16:03:00 - INFO - farm.infer - 0 \n", "12/31/2020 16:03:00 - INFO - farm.infer - /w\\\n", "12/31/2020 16:03:00 - INFO - farm.infer - /'\\\n", "12/31/2020 16:03:00 - INFO - farm.infer - \n", "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 30.13 Batches/s]\n", "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 33.39 Batches/s]\n" ], "name": "stderr" }, { "output_type": "stream", "text": [ "{'query': 'Where is Los Angeles located?', 'no_ans_gap': -0.41483497619628906, 'answers': [{'answer': None, 'score': 1.4850826263427734, 'probability': 0.5462760172072342, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.0702476501464844, 'probability': 0.5333954464343146, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. 
Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n", "{'query': 'Where is LA located?', 'no_ans_gap': -0.19409167766571045, 'answers': [{'answer': None, 'score': 1.815855860710144, 'probability': 0.5565031131376446, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'score': 1.6217641830444336, 'probability': 0.5505072801165964, 'context': 'Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. 
Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow', 'offset_start': 0, 'offset_end': 328, 'offset_start_in_doc': 34, 'offset_end_in_doc': 362, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 33.63 Batches/s]\n", "Inferencing Samples: 100%|██████████| 1/1 [00:00<00:00, 32.21 Batches/s]" ], "name": "stderr" }, { "output_type": "stream", "text": [ "{'query': 'Where is Barstow located?', 'no_ans_gap': -1.593643844127655, 'answers': [{'answer': None, 'score': 2.3197927474975586, 'probability': 0.5719897905641838, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Las Vegas', 'score': 0.7261489033699036, 'probability': 0.522676586113031, 'context': 'de their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.', 'offset_start': 72, 'offset_end': 81, 'offset_start_in_doc': 387, 'offset_end_in_doc': 396, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n", "{'query': 'Where is Las Vegas located ?', 'no_ans_gap': -2.1370767652988434, 'answers': [{'answer': None, 'score': 2.1370767652988434, 'probability': 0.5663893175959525, 'context': None, 'offset_start': 0, 'offset_end': 0, 'document_id': None, 'meta': None}, {'answer': 'Los Angeles', 'score': -0.025329262018203735, 'probability': 0.49920846122316637, 'context': 'The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. 
However, WBGO was playing some cool ja', 'offset_start': 55, 'offset_end': 66, 'offset_start_in_doc': 55, 'offset_end_in_doc': 66, 'document_id': '4fa8dd28-9694-47cb-bc5a-19a74f357403'}]}\n", "\n", "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "\n" ], "name": "stderr" } ] } ] } ================================================ FILE: Chapter10/QA.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" }, "pycharm": { "stem_cell": { "cell_type": "raw", "source": [], "metadata": { "collapsed": false } } }, "colab": { "name": "QA.ipynb", "provenance": [], "collapsed_sections": [] }, "widgets": { "application/vnd.jupyter.widget-state+json": { "ec5480ed053b46cdb517d77899900a2f": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_85a1571687b84d19ae442c5f81f26f7a", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_e2eb771a977f44c180f99f87ca99fd77", "IPY_MODEL_cb3ee57a490d4a9592e4b122d0d81948" ] } }, "85a1571687b84d19ae442c5f81f26f7a": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, 
"align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "e2eb771a977f44c180f99f87ca99fd77": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_3da87e6c040b440988d93c43ac3a2c09", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 463, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 463, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_dceac99c8e52482e9f62a5a55898641e" } }, "cb3ee57a490d4a9592e4b122d0d81948": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_a07d84c9acd24a909d65f8b16f85fbe9", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 463/463 [00:00<00:00, 982B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_1f9e54937bcc49858ae9a938d899379b" } }, 
"3da87e6c040b440988d93c43ac3a2c09": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "dceac99c8e52482e9f62a5a55898641e": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "a07d84c9acd24a909d65f8b16f85fbe9": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "1f9e54937bcc49858ae9a938d899379b": { "model_module": 
"@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "27a07215928f497db5e317b82e9e5922": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_7c2581de98954b79b19ee3c6a2259ba7", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_7b616c40df534e3fa921221fe620a3d9", "IPY_MODEL_23afbc66e2dd43eb84d3d731c46263f1" ] } }, "7c2581de98954b79b19ee3c6a2259ba7": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, 
"border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "7b616c40df534e3fa921221fe620a3d9": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_e084d2d75eb14fe3aacedbd5ecf711bc", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 54236116, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 54236116, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_a4b121058c934cf081e8af64e893b913" } }, "23afbc66e2dd43eb84d3d731c46263f1": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_b7c0ba54037049e7a9ce91a85c41c580", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 54.2M/54.2M [00:02<00:00, 21.2MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": 
"IPY_MODEL_d7d6eb6e2945450fb1c0590a506222e5" } }, "e084d2d75eb14fe3aacedbd5ecf711bc": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "a4b121058c934cf081e8af64e893b913": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "b7c0ba54037049e7a9ce91a85c41c580": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, 
"d7d6eb6e2945450fb1c0590a506222e5": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "a7f35783ec6249be8ccfba1c83ed0e9f": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_c9bd7a41ada546e88509a60576dcbd81", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_dad65f326c614a58a0c13c94accab562", "IPY_MODEL_f853299184d54861874b30a6087c6e3b" ] } }, "c9bd7a41ada546e88509a60576dcbd81": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, 
"flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "dad65f326c614a58a0c13c94accab562": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_863fcf057e7d4fe984c531d6b1291814", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 231508, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 231508, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_058caba18b6d4a1e8fb52d573465255e" } }, "f853299184d54861874b30a6087c6e3b": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_6234b7e5ca9c4067ae359a31f9b38e27", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 232k/232k [00:00<00:00, 436kB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", 
"layout": "IPY_MODEL_a475b921ef3d4315990e07512ff759a3" } }, "863fcf057e7d4fe984c531d6b1291814": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "058caba18b6d4a1e8fb52d573465255e": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "6234b7e5ca9c4067ae359a31f9b38e27": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, 
"a475b921ef3d4315990e07512ff759a3": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "469aaef964d644198b9cf9b878c56178": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_045cce4661714b078350aa8c12f86680", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_4f43fcc08b664d0d9a9c5edbec57dfe2", "IPY_MODEL_a61f3ca504574a4db912806d920daad9" ] } }, "045cce4661714b078350aa8c12f86680": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, 
"flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "4f43fcc08b664d0d9a9c5edbec57dfe2": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { "_view_name": "ProgressView", "style": "IPY_MODEL_da55a1ee4c1547c180fc2fa62e8908d2", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 466062, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 466062, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_9d754136f43641b4996383d99a3163c3" } }, "a61f3ca504574a4db912806d920daad9": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_fd4c6744a60340b4b004710cf2e9c96c", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 466k/466k [00:00<00:00, 1.40MB/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", 
"layout": "IPY_MODEL_3a196782e8624fca9baf0561b48cf0b8" } }, "da55a1ee4c1547c180fc2fa62e8908d2": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "9d754136f43641b4996383d99a3163c3": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "fd4c6744a60340b4b004710cf2e9c96c": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, 
"3a196782e8624fca9baf0561b48cf0b8": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } } } } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "eSCTm5d8MhJA" }, "source": [ "Question Answering Transformers with Hugging Face\n", "\n", "Copyright 2020 Denis Rothman\n", "\n", "[Hugging Face notebook Resources and Documentation](https://huggingface.co/)" ] }, { "cell_type": "code", "metadata": { "pycharm": { "name": "#%% code\n" }, "id": "4maAknWNrl_N" }, "source": [ "!pip install -q transformers==4.0.0" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "pycharm": { "is_executing": false, "name": "#%% code \n" }, "id": "uKaqzCh6rl_V" }, "source": [ "from transformers import pipeline" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "4VYUquAoa2eT" }, "source": [ "nlp_qa = pipeline('question-answering')" ], "execution_count": null, "outputs": [] }, { "cell_type": "markdown", "metadata": { 
"id": "ZxKBah-9iYF7" }, "source": [ "Sample 1:The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show." ] }, { "cell_type": "code", "metadata": { "id": "MqvL7FP6bhzv" }, "source": [ "sequence = \"The traffic began to slow down on Pioneer Boulevard in Los Angeles, making it difficult to get out of the city. However, WBGO was playing some cool jazz, and the weather was cool, making it rather pleasant to be making it out of the city on this Friday afternoon. Nat King Cole was singing as Jo and Maria slowly made their way out of LA and drove toward Barstow. They planned to get to Las Vegas early enough in the evening to have a nice dinner and go see a show.\"" ], "execution_count": null, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "syYGx5ZF6rkL" }, "source": [ "Question-Answering" ] }, { "cell_type": "code", "metadata": { "id": "04tFdSHTbsFQ", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "4a720d1b-2764-4868-dbb8-e4ed662915ac" }, "source": [ "nlp_qa(context=sequence, question='Where is Pioneer Boulevard ?')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'Los Angeles', 'end': 66, 'score': 0.9879737496376038, 'start': 55}" ] }, "metadata": { "tags": [] }, "execution_count": 14 } ] }, { "cell_type": "markdown", "metadata": { "id": "Mqt2Z8qN6vNz" }, "source": [ "Named Entity Recognition(NER)" ] }, { "cell_type": "code", "metadata": { "id": "V5GJSN_ui3J6", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "2f21cb0f-4d86-4851-bef4-003a6f67ecab" }, 
"source": [ "nlp_ner = pipeline(\"ner\")\n", "print(nlp_ner(sequence))" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "[{'word': 'Pioneer', 'score': 0.9735257029533386, 'entity': 'I-LOC', 'index': 8}, {'word': 'Boulevard', 'score': 0.9944824576377869, 'entity': 'I-LOC', 'index': 9}, {'word': 'Los', 'score': 0.9995775818824768, 'entity': 'I-LOC', 'index': 11}, {'word': 'Angeles', 'score': 0.9995693564414978, 'entity': 'I-LOC', 'index': 12}, {'word': 'W', 'score': 0.991984486579895, 'entity': 'I-ORG', 'index': 26}, {'word': '##B', 'score': 0.990750253200531, 'entity': 'I-ORG', 'index': 27}, {'word': '##G', 'score': 0.9884582161903381, 'entity': 'I-ORG', 'index': 28}, {'word': '##O', 'score': 0.9722681641578674, 'entity': 'I-ORG', 'index': 29}, {'word': 'Nat', 'score': 0.9966881275177002, 'entity': 'I-PER', 'index': 59}, {'word': 'King', 'score': 0.997648298740387, 'entity': 'I-PER', 'index': 60}, {'word': 'Cole', 'score': 0.9986170530319214, 'entity': 'I-PER', 'index': 61}, {'word': 'Jo', 'score': 0.9978788495063782, 'entity': 'I-PER', 'index': 65}, {'word': 'Maria', 'score': 0.9988164901733398, 'entity': 'I-PER', 'index': 67}, {'word': 'LA', 'score': 0.998134434223175, 'entity': 'I-LOC', 'index': 74}, {'word': 'Bar', 'score': 0.9970266819000244, 'entity': 'I-LOC', 'index': 78}, {'word': '##sto', 'score': 0.8573915958404541, 'entity': 'I-LOC', 'index': 79}, {'word': '##w', 'score': 0.9920249581336975, 'entity': 'I-LOC', 'index': 80}, {'word': 'Las', 'score': 0.9993551969528198, 'entity': 'I-LOC', 'index': 87}, {'word': 'Vegas', 'score': 0.9989539384841919, 'entity': 'I-LOC', 'index': 88}]\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "ye1D9aYaun7y", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "e49f9baa-b5e4-401c-d051-951ff090a209" }, "source": [ "nlp_qa = pipeline('question-answering')\n", "print(\"Question 1.\",nlp_qa(context=sequence, question='Where is Pioneer Boulevard ?'))\n", 
"print(\"Question 2.\",nlp_qa(context=sequence, question='Where is Los Angeles located?'))\n", "print(\"Question 3.\",nlp_qa(context=sequence, question='Where is LA ?'))\n", "print(\"Question 4.\",nlp_qa(context=sequence, question='Where is Barstow ?'))\n", "print(\"Question 5.\",nlp_qa(context=sequence, question='Where is Las Vegas located ?'))" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "Question 1. {'score': 0.9879737496376038, 'start': 55, 'end': 66, 'answer': 'Los Angeles'}\n", "Question 2. {'score': 0.9875388741493225, 'start': 34, 'end': 51, 'answer': 'Pioneer Boulevard'}\n", "Question 3. {'score': 0.5090540647506714, 'start': 55, 'end': 66, 'answer': 'Los Angeles'}\n", "Question 4. {'score': 0.3695431649684906, 'start': 387, 'end': 396, 'answer': 'Las Vegas'}\n", "Question 5. {'score': 0.21839778125286102, 'start': 355, 'end': 362, 'answer': 'Barstow'}\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "TPd42T7TrhVH" }, "source": [ "Question-answering applied to NER person entities" ] }, { "cell_type": "code", "metadata": { "id": "6yQyrSjsv6dJ", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "4780dda5-4485-417e-c0e1-1c4ca8bd9cb5" }, "source": [ "nlp_qa = pipeline('question-answering')\n", "nlp_qa(context=sequence, question='Who was singing ?')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'Nat King Cole',\n", " 'end': 277,\n", " 'score': 0.9653680324554443,\n", " 'start': 264}" ] }, "metadata": { "tags": [] }, "execution_count": 17 } ] }, { "cell_type": "code", "metadata": { "id": "CfOlUtS0wapC", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "4cce8650-5d46-4374-d987-8111c6e81cbc" }, "source": [ "nlp_qa(context=sequence, question='Who was going to Las Vegas ?')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'Nat King Cole',\n", " 
'end': 277,\n", " 'score': 0.4316245913505554,\n", " 'start': 264}" ] }, "metadata": { "tags": [] }, "execution_count": 18 } ] }, { "cell_type": "code", "metadata": { "id": "DI_8OcAdx7Rp", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "212ae5d4-23e1-4016-e846-fe74cef78a26" }, "source": [ "nlp_qa(context=sequence, question='Who are they?')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'Jo and Maria',\n", " 'end': 305,\n", " 'score': 0.8486908078193665,\n", " 'start': 293}" ] }, "metadata": { "tags": [] }, "execution_count": 19 } ] }, { "cell_type": "code", "metadata": { "id": "Oc3Pe7CByyhc", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "5625a1f6-1b12-4ab4-813b-74a0fa3f727d" }, "source": [ "nlp_qa(context=sequence, question='Who drove to Las Vegas?')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'Nat King Cole was singing as Jo and Maria',\n", " 'end': 305,\n", " 'score': 0.35941559076309204,\n", " 'start': 264}" ] }, "metadata": { "tags": [] }, "execution_count": 20 } ] }, { "cell_type": "markdown", "metadata": { "id": "TmF96wthzwWT" }, "source": [ "Description of the Default Model" ] }, { "cell_type": "code", "metadata": { "id": "_EMgV9dnz60s", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "86ed2cfb-d7e1-4038-fd6a-daba48a80414" }, "source": [ "print(nlp_qa.model)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "DistilBertForQuestionAnswering(\n", " (distilbert): DistilBertModel(\n", " (embeddings): Embeddings(\n", " (word_embeddings): Embedding(28996, 768, padding_idx=0)\n", " (position_embeddings): Embedding(512, 768)\n", " (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " )\n", " (transformer): Transformer(\n", " (layer): ModuleList(\n", " (0): TransformerBlock(\n", " 
(attention): MultiHeadSelfAttention(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (q_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (k_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (v_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (out_lin): Linear(in_features=768, out_features=768, bias=True)\n", " )\n", " (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (ffn): FFN(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (lin1): Linear(in_features=768, out_features=3072, bias=True)\n", " (lin2): Linear(in_features=3072, out_features=768, bias=True)\n", " )\n", " (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " )\n", " (1): TransformerBlock(\n", " (attention): MultiHeadSelfAttention(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (q_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (k_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (v_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (out_lin): Linear(in_features=768, out_features=768, bias=True)\n", " )\n", " (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (ffn): FFN(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (lin1): Linear(in_features=768, out_features=3072, bias=True)\n", " (lin2): Linear(in_features=3072, out_features=768, bias=True)\n", " )\n", " (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " )\n", " (2): TransformerBlock(\n", " (attention): MultiHeadSelfAttention(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (q_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (k_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (v_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (out_lin): Linear(in_features=768, out_features=768, bias=True)\n", " )\n", " (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", 
" (ffn): FFN(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (lin1): Linear(in_features=768, out_features=3072, bias=True)\n", " (lin2): Linear(in_features=3072, out_features=768, bias=True)\n", " )\n", " (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " )\n", " (3): TransformerBlock(\n", " (attention): MultiHeadSelfAttention(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (q_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (k_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (v_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (out_lin): Linear(in_features=768, out_features=768, bias=True)\n", " )\n", " (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (ffn): FFN(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (lin1): Linear(in_features=768, out_features=3072, bias=True)\n", " (lin2): Linear(in_features=3072, out_features=768, bias=True)\n", " )\n", " (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " )\n", " (4): TransformerBlock(\n", " (attention): MultiHeadSelfAttention(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (q_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (k_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (v_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (out_lin): Linear(in_features=768, out_features=768, bias=True)\n", " )\n", " (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (ffn): FFN(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (lin1): Linear(in_features=768, out_features=3072, bias=True)\n", " (lin2): Linear(in_features=3072, out_features=768, bias=True)\n", " )\n", " (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " )\n", " (5): TransformerBlock(\n", " (attention): MultiHeadSelfAttention(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (q_lin): 
Linear(in_features=768, out_features=768, bias=True)\n", " (k_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (v_lin): Linear(in_features=768, out_features=768, bias=True)\n", " (out_lin): Linear(in_features=768, out_features=768, bias=True)\n", " )\n", " (sa_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " (ffn): FFN(\n", " (dropout): Dropout(p=0.1, inplace=False)\n", " (lin1): Linear(in_features=768, out_features=3072, bias=True)\n", " (lin2): Linear(in_features=3072, out_features=768, bias=True)\n", " )\n", " (output_layer_norm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n", " )\n", " )\n", " )\n", " )\n", " (qa_outputs): Linear(in_features=768, out_features=2, bias=True)\n", " (dropout): Dropout(p=0.1, inplace=False)\n", ")\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "vAlWY5E6TfKL" }, "source": [ "Question-Answering with ELECTRA" ] }, { "cell_type": "code", "metadata": { "id": "BFNSvGN0znq9", "colab": { "base_uri": "https://localhost:8080/", "height": 386, "referenced_widgets": [ "ec5480ed053b46cdb517d77899900a2f", "85a1571687b84d19ae442c5f81f26f7a", "e2eb771a977f44c180f99f87ca99fd77", "cb3ee57a490d4a9592e4b122d0d81948", "3da87e6c040b440988d93c43ac3a2c09", "dceac99c8e52482e9f62a5a55898641e", "a07d84c9acd24a909d65f8b16f85fbe9", "1f9e54937bcc49858ae9a938d899379b", "27a07215928f497db5e317b82e9e5922", "7c2581de98954b79b19ee3c6a2259ba7", "7b616c40df534e3fa921221fe620a3d9", "23afbc66e2dd43eb84d3d731c46263f1", "e084d2d75eb14fe3aacedbd5ecf711bc", "a4b121058c934cf081e8af64e893b913", "b7c0ba54037049e7a9ce91a85c41c580", "d7d6eb6e2945450fb1c0590a506222e5", "a7f35783ec6249be8ccfba1c83ed0e9f", "c9bd7a41ada546e88509a60576dcbd81", "dad65f326c614a58a0c13c94accab562", "f853299184d54861874b30a6087c6e3b", "863fcf057e7d4fe984c531d6b1291814", "058caba18b6d4a1e8fb52d573465255e", "6234b7e5ca9c4067ae359a31f9b38e27", "a475b921ef3d4315990e07512ff759a3", "469aaef964d644198b9cf9b878c56178", 
"045cce4661714b078350aa8c12f86680", "4f43fcc08b664d0d9a9c5edbec57dfe2", "a61f3ca504574a4db912806d920daad9", "da55a1ee4c1547c180fc2fa62e8908d2", "9d754136f43641b4996383d99a3163c3", "fd4c6744a60340b4b004710cf2e9c96c", "3a196782e8624fca9baf0561b48cf0b8" ] }, "outputId": "1f9b1b3e-f51d-48dd-a97e-d5f43d3207c9" }, "source": [ "nlp_qa = pipeline('question-answering', model='google/electra-small-generator', tokenizer='google/electra-small-generator')\n", "nlp_qa(context=sequence, question='Who drove to Las Vegas ?')" ], "execution_count": null, "outputs": [ { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ec5480ed053b46cdb517d77899900a2f", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=463.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "27a07215928f497db5e317b82e9e5922", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=54236116.0, style=ProgressStyle(descrip…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "a7f35783ec6249be8ccfba1c83ed0e9f", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=231508.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "469aaef964d644198b9cf9b878c56178", "version_minor": 0, "version_major": 2 }, "text/plain": [ 
"HBox(children=(FloatProgress(value=0.0, description='Downloading', max=466062.0, style=ProgressStyle(descripti…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n" ], "name": "stdout" }, { "output_type": "stream", "text": [ "Some weights of the model checkpoint at google/electra-small-generator were not used when initializing ElectraForQuestionAnswering: ['generator_predictions.LayerNorm.weight', 'generator_predictions.LayerNorm.bias', 'generator_predictions.dense.weight', 'generator_predictions.dense.bias', 'generator_lm_head.weight', 'generator_lm_head.bias']\n", "- This IS expected if you are initializing ElectraForQuestionAnswering from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).\n", "- This IS NOT expected if you are initializing ElectraForQuestionAnswering from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).\n", "Some weights of ElectraForQuestionAnswering were not initialized from the model checkpoint at google/electra-small-generator and are newly initialized: ['qa_outputs.weight', 'qa_outputs.bias']\n", "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.\n" ], "name": "stderr" }, { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'rather pleasant to be making it out of the city on this',\n", " 'end': 245,\n", " 'score': 0.00034621506347320974,\n", " 'start': 190}" ] }, "metadata": { "tags": [] }, "execution_count": 22 } ] }, { "cell_type": "markdown", "metadata": { "id": "49PDRpKHsc41" }, "source": [ "Question Answering with default Model and SRL" ] }, { "cell_type": "code", "metadata": { "id": "W8kGz5ihz96g", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "393a5a3e-ea75-4c7e-84f3-9f9930edd164" 
}, "source": [ "nlp_qa = pipeline('question-answering')\n", "nlp_qa(context=sequence, question='What was slow?')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'The traffic', 'end': 11, 'score': 0.46530455350875854, 'start': 0}" ] }, "metadata": { "tags": [] }, "execution_count": 23 } ] }, { "cell_type": "code", "metadata": { "id": "4mycOJhdugbL", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "aeeb1e97-51e7-4658-da75-cbeba382521b" }, "source": [ "nlp_qa = pipeline('question-answering')\n", "nlp_qa(context=sequence, question='What was playing')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'cool jazz', 'end': 152, 'score': 0.3511938154697418, 'start': 143}" ] }, "metadata": { "tags": [] }, "execution_count": 24 } ] }, { "cell_type": "code", "metadata": { "id": "bniJUNoxwtiw", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "b3ea4b39-8c69-4cef-9a57-401aefd9065e" }, "source": [ "nlp_qa = pipeline('question-answering')\n", "nlp_qa(context=sequence, question='Who sees a show?')" ], "execution_count": null, "outputs": [ { "output_type": "execute_result", "data": { "text/plain": [ "{'answer': 'Nat King Cole',\n", " 'end': 277,\n", " 'score': 0.5588219165802002,\n", " 'start': 264}" ] }, "metadata": { "tags": [] }, "execution_count": 25 } ] } ] } ================================================ FILE: Chapter11/SentimentAnalysis.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" }, "pycharm": { "stem_cell": { "cell_type": "raw", "source": [], 
"metadata": { "collapsed": false } } }, "colab": { "name": "SentimentAnalysis.ipynb", "provenance": [], "collapsed_sections": [] }, "widgets": { "application/vnd.jupyter.widget-state+json": { "491c9ee2f443495dba7465ab25a7ba70": { "model_module": "@jupyter-widgets/controls", "model_name": "HBoxModel", "state": { "_view_name": "HBoxView", "_dom_classes": [], "_model_name": "HBoxModel", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.5.0", "box_style": "", "layout": "IPY_MODEL_ea8709edb3204155b74956ae66fd9d78", "_model_module": "@jupyter-widgets/controls", "children": [ "IPY_MODEL_d7677297d78940de840ebfe90e20aac8", "IPY_MODEL_b95468d3abe24608997d4ea2a26a6449" ] } }, "ea8709edb3204155b74956ae66fd9d78": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "d7677297d78940de840ebfe90e20aac8": { "model_module": "@jupyter-widgets/controls", "model_name": "FloatProgressModel", "state": { 
"_view_name": "ProgressView", "style": "IPY_MODEL_52d2aae401344ddea76b1032e10582a4", "_dom_classes": [], "description": "Downloading: 100%", "_model_name": "FloatProgressModel", "bar_style": "success", "max": 230, "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": 230, "_view_count": null, "_view_module_version": "1.5.0", "orientation": "horizontal", "min": 0, "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_99579edca7fd4e5aa93491630aac72c0" } }, "b95468d3abe24608997d4ea2a26a6449": { "model_module": "@jupyter-widgets/controls", "model_name": "HTMLModel", "state": { "_view_name": "HTMLView", "style": "IPY_MODEL_1f0b1e493558418b9ff5558cfc7baae3", "_dom_classes": [], "description": "", "_model_name": "HTMLModel", "placeholder": "​", "_view_module": "@jupyter-widgets/controls", "_model_module_version": "1.5.0", "value": " 230/230 [00:00<00:00, 541B/s]", "_view_count": null, "_view_module_version": "1.5.0", "description_tooltip": null, "_model_module": "@jupyter-widgets/controls", "layout": "IPY_MODEL_bc86f7386abf4e1fb2aba758a2982039" } }, "52d2aae401344ddea76b1032e10582a4": { "model_module": "@jupyter-widgets/controls", "model_name": "ProgressStyleModel", "state": { "_view_name": "StyleView", "_model_name": "ProgressStyleModel", "description_width": "initial", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "bar_color": null, "_model_module": "@jupyter-widgets/controls" } }, "99579edca7fd4e5aa93491630aac72c0": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, 
"_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, "_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } }, "1f0b1e493558418b9ff5558cfc7baae3": { "model_module": "@jupyter-widgets/controls", "model_name": "DescriptionStyleModel", "state": { "_view_name": "StyleView", "_model_name": "DescriptionStyleModel", "description_width": "", "_view_module": "@jupyter-widgets/base", "_model_module_version": "1.5.0", "_view_count": null, "_view_module_version": "1.2.0", "_model_module": "@jupyter-widgets/controls" } }, "bc86f7386abf4e1fb2aba758a2982039": { "model_module": "@jupyter-widgets/base", "model_name": "LayoutModel", "state": { "_view_name": "LayoutView", "grid_template_rows": null, "right": null, "justify_content": null, "_view_module": "@jupyter-widgets/base", "overflow": null, "_model_module_version": "1.2.0", "_view_count": null, "flex_flow": null, "width": null, "min_width": null, "border": null, "align_items": null, "bottom": null, "_model_module": "@jupyter-widgets/base", "top": null, "grid_column": null, "overflow_y": null, "overflow_x": null, "grid_auto_flow": null, "grid_area": null, "grid_template_columns": null, "flex": null, "_model_name": "LayoutModel", "justify_items": null, "grid_row": null, "max_height": null, "align_content": null, "visibility": null, "align_self": null, "height": null, "min_height": null, "padding": null, "grid_auto_rows": null, "grid_gap": null, "max_width": null, "order": null, 
"_view_module_version": "1.2.0", "grid_template_areas": null, "object_position": null, "object_fit": null, "grid_auto_columns": null, "margin": null, "display": null, "left": null } } } } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "rFCzxMzfG2Jh", "colab_type": "text" }, "source": [ "#Sentiment Analysis with Hugging Face Models and AllenNLP\n", "\n", "Copyright 2020, Denis Rothman\n", "\n", "Resources\n", "\n", "[AllenNLP](https://demo.allennlp.org/sentiment-analysis)\n", "\n", "[Hugging Face Pipelines](https://huggingface.co/transformers/main_classes/pipelines.html)\n", "\n", "[Hugging Face Models](https://huggingface.co/models)\n" ] }, { "cell_type": "code", "metadata": { "id": "TssYtycqPQSW", "colab_type": "code", "colab": {} }, "source": [ "!pip install allennlp==1.0.0 allennlp-models==1.0.0" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "4-nbsdFAQyVj", "colab_type": "code", "colab": {} }, "source": [ "!echo '{\"sentence\": \"Whether or not you're enlightened by any of Derrida's lectures on the other and the self, Derrida is an undeniably fascinating and playful fellow.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "pycharm": { "name": "#%% code\n" }, "id": "4maAknWNrl_N", "colab_type": "code", "colab": {} }, "source": [ "!pip install -q transformers" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "pycharm": { "is_executing": false, "name": "#%% code \n" }, "id": "uKaqzCh6rl_V", "colab_type": "code", "colab": {} }, "source": [ "from transformers import pipeline" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "CRUQAGAzA1Vr", "colab_type": "code", "colab": {} }, "source": [ "def classify(sequence,M):\n", " #DistilBertForSequenceClassification(default model)\n", " nlp_cls = 
pipeline('sentiment-analysis') \n", " if M==1:\n", " print(nlp_cls.model.config)\n", " return nlp_cls(sequence)\n" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "BitkJ4tM5C9p", "colab_type": "code", "colab": { "base_uri": "https://localhost:8080/", "height": 100, "referenced_widgets": [ "491c9ee2f443495dba7465ab25a7ba70", "ea8709edb3204155b74956ae66fd9d78", "d7677297d78940de840ebfe90e20aac8", "b95468d3abe24608997d4ea2a26a6449", "52d2aae401344ddea76b1032e10582a4", "99579edca7fd4e5aa93491630aac72c0", "1f0b1e493558418b9ff5558cfc7baae3", "bc86f7386abf4e1fb2aba758a2982039" ] }, "outputId": "75b4010f-686f-4e1d-f5d8-dc9d286af660" }, "source": [ "seq=3\n", "if seq==1:\n", " sequence=\"The battery on my Model9X phone doesn't last more than 6 hours and I'm unhappy about that.\"\n", "if seq==2:\n", " sequence=\"The battery on my Model9X phone doesn't last more than 6 hours and I'm unhappy about that. I was really mad! I bought a Model10x and things seem to be better. 
I'm super satisfied now.\"\n", "if seq==3:\n", " sequence=\"The customer was very unhappy\"\n", "if seq==4:\n", " sequence=\"The customer was very satisfied\"\n", "print(sequence)\n", "M=0 #display model configuration=1, default=0\n", "CS=classify(sequence,M) \n", "print(CS)" ], "execution_count": null, "outputs": [ { "output_type": "stream", "text": [ "The customer was very unhappy\n" ], "name": "stdout" }, { "output_type": "display_data", "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "491c9ee2f443495dba7465ab25a7ba70", "version_minor": 0, "version_major": 2 }, "text/plain": [ "HBox(children=(FloatProgress(value=0.0, description='Downloading', max=230.0, style=ProgressStyle(description_…" ] }, "metadata": { "tags": [] } }, { "output_type": "stream", "text": [ "\n", "[{'label': 'NEGATIVE', 'score': 0.9997098445892334}]\n" ], "name": "stdout" } ] } ] } ================================================ FILE: Chapter12/Fake_News.ipynb ================================================ { "nbformat": 4, "nbformat_minor": 0, "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.6" }, "pycharm": { "stem_cell": { "cell_type": "raw", "source": [], "metadata": { "collapsed": false } } }, "colab": { "name": "Fake_News.ipynb", "provenance": [], "collapsed_sections": [] } }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "rFCzxMzfG2Jh", "colab_type": "text" }, "source": [ "#Fake News\n", "\n", "Copyright 2020, Denis Rothman\n", "\n", "## Notebook resources:\n", "\n", "[Hugging Face](https://huggingface.co)\n", "\n", "[The Allen Institute for AI](https://allennlp.org/)
\n", "Some of the AllenNLP resources come from Hugging Face\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "id": "_UlmKCHMttiY", "colab_type": "text" }, "source": [ "## Fake News from an emotional perspective \n", "\n", "AllenNLP Sentiment Analysis with RoBERTa-large \n" ] }, { "cell_type": "code", "metadata": { "id": "TssYtycqPQSW", "colab_type": "code", "colab": {} }, "source": [ "!pip install allennlp==1.0.0 allennlp-models==1.0.0" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "4-nbsdFAQyVj", "colab_type": "code", "colab": {} }, "source": [ "!echo '{\"sentence\":\"Climate change is bogus. It’s a plot by the liberals to take the economy down.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "jFgnbGsEwXUe", "colab": {} }, "source": [ "!echo '{\"sentence\":\"I am a Republican and think that climate change consciousness is a great thing!\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -" ], "execution_count": null, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "5TWl4dpzbGe6", "colab_type": "text" }, "source": [ "## GUN CONTROL" ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "pDX7NjFVa31H", "colab": {} }, "source": [ "!echo '{\"sentence\":\"I have had rifles and guns for years and never had a problem. 
I raised my kids right so they have guns too and never hurt anything except rabbits.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "PpdecpJPfTl6", "colab": {} }, "source": [ "!echo '{\"sentence\":\"I have heard gunshots all my life in my neighborhood, have lost many friends, and am afraid to go out at night.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/sst-roberta-large-2020.06.08.tar.gz -" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "pycharm": { "name": "#%% code\n" }, "id": "4maAknWNrl_N", "colab_type": "code", "colab": {} }, "source": [ "!pip install -q transformers" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "pycharm": { "is_executing": false, "name": "#%% code \n" }, "id": "uKaqzCh6rl_V", "colab_type": "code", "colab": {} }, "source": [ "from transformers import pipeline\n", "from transformers import AutoTokenizer, AutoModelForSequenceClassification,AutoModel" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "6bWOwVgIh9Ai", "colab_type": "code", "colab": {} }, "source": [ "nlp_token_class = pipeline('ner')\n", "nlp_token_class('I have had rifles and guns for years and never had a problem. I raised my kids right so they have guns too and never hurt anything except rabbits.')" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "69Az4owA2UQv", "colab": {} }, "source": [ "nlp_token_class = pipeline('ner')\n", "nlp_token_class('I have had rifles and guns for years and never had a problem. 
I raised my kids right so they have guns too and never hurt anything except rabbits.')" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "VmJw_cAmkI43", "colab": {} }, "source": [ "nlp_token_class.model.config" ], "execution_count": null, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "IE-TRXfppsKJ", "colab_type": "code", "colab": {} }, "source": [ "!echo '{\"sentence\": \"I have heard gunshots all my life in my neighborhood, have lost many friends, and am afraid to go out at night.\"}' | \\\n", "allennlp predict https://storage.googleapis.com/allennlp-public-models/bert-base-srl-2020.03.24.tar.gz -" ], "execution_count": null, "outputs": [] } ] } ================================================ FILE: README.md ================================================ --- ## Join Our Newsletters 📬 ### DataPro *The future of AI is unfolding. Don’t fall behind.*

DataPro QR

Stay ahead with [**DataPro**](https://landing.packtpub.com/subscribe-datapronewsletter/?link_from_packtlink=yes), the free weekly newsletter for data scientists, AI/ML researchers, and data engineers. From trending tools like **PyTorch**, **scikit-learn**, **XGBoost**, and **BentoML** to hands-on insights on **database optimization** and real-world **ML workflows**, you’ll get what matters, fast.

> Stay sharp with [DataPro](https://landing.packtpub.com/subscribe-datapronewsletter/?link_from_packtlink=yes). Join **115K+ data professionals** who never miss a beat.

---

### BIPro

*Business runs on data. Make sure yours tells the right story.*

BIPro QR

[**BIPro**](https://landing.packtpub.com/subscribe-bipro-newsletter/?link_from_packtlink=yes) is your free weekly newsletter for BI professionals, analysts, and data leaders. Get practical tips on **dashboarding**, **data visualization**, and **analytics strategy** with tools like **Power BI**, **Tableau**, **Looker**, **SQL**, and **dbt**.

> Get smarter with [BIPro](https://landing.packtpub.com/subscribe-bipro-newsletter/?link_from_packtlink=yes). Trusted by **35K+ BI professionals**, see what you’re missing.

# Transformers for Natural Language Processing

This is the code repository for [Transformers for Natural Language Processing](https://www.packtpub.com/product/transformers-for-natural-language-processing/9781800565791), published by [Packt](https://www.packtpub.com/?utm_source=github). It contains all the supporting project files necessary to work through the book from start to finish.

* **Paperback**: 384 pages
* **ISBN-13**: 9781800565791
* **Date of Publication**: January 2021

## Links

* [Amazon](https://www.amazon.com/Transformers-Natural-Language-Processing-architectures-ebook/dp/B08S977X8K/)
* [Packt Publishing](https://www.packtpub.com/product/transformers-for-natural-language-processing/9781800565791)

## About the Book

Transformers for Natural Language Processing investigates in vast detail deep learning for machine translation, speech-to-text, text-to-speech, language modeling, question answering, and many more NLP domains in the context of transformers. The book takes you through natural language processing with Python and examines various eminent models and datasets in transformer technology created by pioneers such as Google, Facebook, Microsoft, OpenAI, Hugging Face, and other contributors.

The book trains you in three stages. 
The first stage introduces you to transformer architectures, starting with the original Transformer before moving on to RoBERTa, BERT, and DistilBERT models. You will discover training methods for smaller transformers that can outperform GPT-3 in some cases. In the second stage, you will apply transformers to Natural Language Understanding (NLU) and generation. Finally, the third stage will help you grasp advanced language understanding techniques such as optimizing social network datasets and fake news identification. By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pre-trained transformer models by tech giants to various datasets.

## Things you will learn

* Use the latest pretrained transformer models
* Grasp the workings of the original Transformer, GPT-2, BERT, T5, and other transformer models
* Create language understanding Python programs using concepts that outperform classical deep learning models
* Use a variety of NLP platforms, including Hugging Face, Trax, and AllenNLP
* Apply Python, TensorFlow, and Keras programs to sentiment analysis, text summarization, speech recognition, machine translation, and more
* Measure the productivity of key transformers to define their scope, potential, and limits in production

## Instructions and Navigation

All of the code is organized into folders that are named chapter-wise, for example: `Chapter02`. 
The code will look like the following:

```python
#@title Activating the GPU
# Main menu->Runtime->Change Runtime Type
import tensorflow as tf
device_name = tf.test.gpu_device_name()
if device_name != '/device:GPU:0':
  raise SystemError('GPU device not found')
print('Found GPU at: {}'.format(device_name))
```

## Software Requirements

Check this file for the hardware and software requirements: [technical_requirements.md](./.other/technical_requirements.md)

## Related Products

* [Python Machine Learning - Third Edition](https://www.packtpub.com/product/python-machine-learning-third-edition/9781789955750)
* [Hands-On Explainable AI (XAI) with Python - Second Edition](https://www.packtpub.com/product/hands-on-explainable-ai-xai-with-python/9781800208131)

### Download a free PDF

If you have already purchased a print or Kindle version of this book, you can get a DRM-free PDF version at no cost.
Simply click on the link to claim your free PDF.

https://packt.link/free-ebook/9781800565791
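
The sentiment-analysis cells in the Chapter 11 and Chapter 12 notebooks return predictions as a list of label/score dictionaries, for example `[{'label': 'NEGATIVE', 'score': 0.9997098445892334}]`. A minimal sketch of reading that output without re-running a model (the helper name `top_label` is ours, not part of the book's notebooks):

```python
def top_label(results):
    """Return (label, score) for the highest-scoring entry in a
    Hugging Face pipeline-style result list."""
    best = max(results, key=lambda r: r["score"])
    return best["label"], best["score"]

# Output shape taken from the Chapter 11 notebook:
prediction = [{"label": "NEGATIVE", "score": 0.9997098445892334}]
print(top_label(prediction))  # → ('NEGATIVE', 0.9997098445892334)
```

The same helper works for any pipeline that emits label/score pairs, such as the `pipeline('sentiment-analysis')` call used in `SentimentAnalysis.ipynb`.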