[
  {
    "path": ".gitignore",
    "content": ".idea\n**/__pycache__\n*.so\noutput/TD500/*\nvis/TD500_test/\n"
  },
  {
    "path": "README.md",
    "content": "# TextBPN-Puls-Plus \nThis is a Pytorch implementation of TextBPN++: [Arbitrary Shape Text Detection via Boundary Transformer](https://arxiv.org/abs/2205.05320);  This project is based on [TextBPN](https://github.com/GXYM/TextBPN)       \n![](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/vis/framework.png)  \n\n*Please light up the stars, thank you! Your encouragement is the energy for us to constantly update！*\n\n\n## News\n- [x] 2025.06.30 Our model [TextBPN-MLOCR](https://github.com/GXYM/TextBPN-MLOCR) achieved first place on [ArT2019](https://rrc.cvc.uab.es/?ch=14&com=evaluation&task=1)_ and [MLT2019](https://rrc.cvc.uab.es/?ch=15&com=evaluation&task=1)\n- [x] 2025.05.16 We are excited to announce the release of our **TextBPN-MLOCR: Advanced Multi-Lingual Scene Text Detection** ([**TextBPN-MLOCR\n**](https://github.com/GXYM/TextBPN-MLOCR))! Designed for robust performance in complex, multilingual environments, this model excels at detecting text of various shapes in any scene. Leveraging a powerful architecture based on DCN and ResNet50, it has been trained on 1.5 million synthetic images and over 0.5 million real-world scene samples. Despite its strong detection capabilities, the model remains lightweight and efficient, making it ideal for a wide range of applications.  \n- [x] 2025.03.13 Added a script for multi-GPU training [train_textBPN_DDP.py](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/train_textBPN_DDP.py) \n- [x] 2023.06.14 Our paper of TextBPN++ has been accepted by IEEE Transactions on Multimedia (T-MM 2023), which can be obtained at [IEEE Xplore](https://ieeexplore.ieee.org/document/10153666).  \n- [x] 2022.11.19 Displayed the visualization results of Reading Order Module.  \n- [x] 2022.11.18 **Important update! Fixed some bugs in training code.**  These bugs make it difficult to reproduce the perfromance in the paper without pre-training. The updated code has more stable training and better performance. 
Thanks to the developers who made us aware of these bugs and pushed us to fix them. The description of the code changes and the reproduction details are [here](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/output).  \n- [x] 2022.10.18 The performance of our TextBPN++ on document OCR has been demonstrated by the paper [\"Text Detection Forgot About Document OCR\"](https://arxiv.org/pdf/2210.07903.pdf) from the University of Oxford, which also reports comparative results for other text detection methods such as PAN, DBNet, DBNet++, DCLNet, and CRAFT. The comparison experiments are very detailed and a valuable reference!\n- [x] 2022.06.20 Updated the illustration of the framework. \n- [x] 2022.06.16 Uploaded the files that were missing because of a naming problem ([CTW1500_Text_New](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/dataset/CTW1500_Text_New.py), [Total_Text_New](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/dataset/Total_Text_New.py)), which support the new label formats for Total_Text (mat) and CTW1500 (xml).  \n- [x] 2022.06.16 We updated the Google Drive links so that the files can be downloaded without permission.\n\n## ToDo List\n\n- [x] Release code\n- [x] Scripts for training and testing\n- [x] Demo script\n- [x] Evaluation\n- [x] Release pre-trained models\n- [x] Release trained models for each benchmark\n- [ ] The reading order module will be released at a certain time\n- [ ] Prepare TextSpotter code based on TextBPN++\n<!-- - [ ] The number of control points is determined adaptively  -->\n\n\n\n## Prerequisites \n  Python 3.8 – 3.12;  \n  PyTorch >= 2.1 (validated with 2.9.0);   \n  TorchVision matching the PyTorch release;  \n  NumPy >= 1.20;   \n  CUDA toolkit >= 11.8 (tested with CUDA 12.x);  \n  GCC >= 10.0;   \n  *opencv-python < 4.5.0*  \n  NVIDIA GPU (Turing/Volta class or newer);  \n  \n  NOTE: We currently validate builds on Arch Linux + Python 3.9 and Ubuntu 20.04/22.04 with Python 3.10–3.12, using PyTorch 2.9.0 and CUDA 12.x. 
Other environments may require minor adjustments but should behave similarly in performance.\n  \n\n## Makefile\n\nIf DCN is used, some CUDA files need to be compiled:\n\n```\n  # Make sure that the CUDA path in setup.py is set properly for your environment.\n  # Set the GPU you need in setup.py if you have different GPUs in your machine.\n  \n  cd network/backbone/assets/dcn\n  sh Makefile.sh\n  \n  # setup.py \n  import os\n  PATH = \"{}:{}\".format(os.environ['PATH'], \"/opt/cuda/bin\")\n  # os.environ['CUDA_VISIBLE_DEVICES'] = \"1\"\n  os.environ['PATH'] = PATH\n  from setuptools import setup\n  from torch.utils.cpp_extension import BuildExtension, CUDAExtension\n```\n\n## Dataset Links  \n1. [CTW1500](https://drive.google.com/file/d/1A2s3FonXq4dHhD64A2NCWc8NQWMH2NFR/view?usp=sharing) (new version dataset at: https://github.com/Yuliang-Liu/Curve-Text-Detector)\n2. [TD500](https://drive.google.com/file/d/1ByluLnyd8-Ltjo9AC-1m7omZnI-FA1u0/view?usp=sharing)    \n3. [Total-Text](https://drive.google.com/file/d/17_7T_-2Bu3KSSg2OkXeCxj97TBsjvueC/view?usp=sharing) (new version dataset at: https://github.com/cs-chan/Total-Text-Dataset)\n\nNOTE: The images of each dataset can be obtained from its official website.\n\n\n## Training \n### Prepare dataset\nWe provide a simple example for each dataset in data, such as [Total-Text](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/data/Total-Text), [CTW-1500](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/data/CTW-1500), and [ArT](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/data/ArT) ...\n\n\n### Pre-training models\nWe provide some pre-training models on SynthText [Baidu Drive](https://pan.baidu.com/s/1PsLrQWAVLDy0fSVp5HL9Ng) (download code: r1ff), [Google Drive](https://drive.google.com/file/d/1x64Lu_dMnQqs9gORQOMFL2wnXUzPWWGT/view?usp=sharing) and MLT-2017 [Baidu Drive](https://pan.baidu.com/s/19V9zqSMdgCHMvPTFngz8Fg) (download code: srym), [Google 
Drive](https://drive.google.com/file/d/1seVzFT657YzP-lc--yUqsVUek2mpw9tP/view?usp=sharing)\n\n```\n├── pretrain\n│   ├──Syn\n│       ├── TextBPN_resnet50_0.pth  #1s\n│       ├── TextBPN_resnet18_0.pth  #4s\n│       ├── TextBPN_deformable_resnet50_0.pth  #1s\n│   ├── MLT\n│       ├── TextBPN_resnet50_300.pth  #1s\n│       ├── TextBPN_resnet18_300.pth  #4s\n│       ├── TextBPN_deformable_resnet50_300.pth #1s\n``` \nNOTE: we also provide the pre-training scripts for [SynthText](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/scripts-train).\n\n### Running the training scripts\nWe provide training scripts for each dataset in scripts-train, such as [Total-Text](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/scripts-train), [CTW-1500](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/scripts-train), and [ArT](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/scripts-train) ...\n\n```\n# train_Totaltext_res50_1s.sh\n#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30\n\n# train_Totaltext_res50_1s_fine_mlt.sh\n#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --load_memory True --resume model/pretrain/MLT/TextBPN_resnet50_300.pth\n\n```\n\n## Testing \n\n### Running the testing scripts\nWe provide testing scripts for each dataset in scripts-eval, such as [Total-Text](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/scripts-eval/Eval_Totaltext.sh), [CTW-1500](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/scripts-eval/Eval_CTW1500.sh), and [ArT](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/scripts-eval/Eval_ArT.sh) ...\n\n```\n# Eval_ArT.sh\n#!/bin/bash\ncd ../\n##################### eval for ArT with ResNet50 1s 
###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name ArT --checkepoch 605 --test_size 960 2880 --dis_threshold 0.4 --cls_threshold 0.4 --gpu 0;\n\n\n##################### eval for ArT with ResNet50-DCN 1s ###################################\nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net deformable_resnet50 --scale 1 --exp_name ArT --checkepoch 480 --test_size 960 2880 --dis_threshold 0.4 --cls_threshold 0.8 --gpu 0;\n\n\n##################### batch eval for ArT ###################################\n#for ((i=660; i>=300; i=i-5));\n#do \n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name ArT --net deformable_resnet50 --checkepoch $i --test_size 960 2880 --dis_threshold 0.45 --cls_threshold 0.8 --gpu 0;\n#done\n```\n\nNOTE: If you want to save the visualization results, you need to enable `--viz`. Here is an example:\n\n``` \nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch 1135 --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 0 --viz;\n```\n\n\n### Demo\nYou can also run prediction on your own images without annotations. Here is an example:\n\n``` \n#demo.sh\n#!/bin/bash\nCUDA_LAUNCH_BLOCKING=1 python3 demo.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch 1135 --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 0 --viz --img_root /path/to/image \n```\n\n### Evaluate the performance\n\nNote that we provide the evaluation protocols for several benchmarks ([Total-Text](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/dataset/total_text), [CTW-1500](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/dataset/ctw1500), [MSRA-TD500](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/dataset/TD500), [ICDAR2015](https://github.com/GXYM/TextBPN-Plus-Plus/tree/main/dataset/icdar15)). The evaluation protocols embedded in the code are obtained from the official ones. 
You don't need to run these protocols separately; our test code calls these scripts automatically. Please refer to \"[util/eval.py](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/util/eval.py)\".\n\n\n### Evaluate the speed \n We follow the speed testing scheme of DB. The speed is evaluated by running inference on a test image 50 times to exclude extra I/O time.\n\n```\nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN_speed.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch 1135 --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 1;\n```\n\nNote that the speed depends on both the GPU and the CPU.\n\n### Model performance  \n\nOur replicated performance (without extra data for pre-training) using the updated code:\n|  Datasets   |  pre-training |  Model        |recall | precision |F-measure| FPS |\n|:----------:\t|:-----------:  |:-----------:\t|:-------:|:-----:|:-------:|:---:|\n| Total-Text \t|       -       | [Res50-1s](https://drive.google.com/file/d/1wZOXGrM74CbrGY_lo4yF907VpbwnLf9d/view?usp=sharing) \t    | 84.22 |91.88 |87.88 |13|\n| CTW-1500  \t|       -       | [Res50-1s](https://drive.google.com/file/d/1wZOXGrM74CbrGY_lo4yF907VpbwnLf9d/view?usp=sharing)\t    | 82.69 |86.53 |84.57 |14|\n\n\nThe results reported in our paper are as follows:\n|  Datasets   |  Model        |recall | precision |F-measure| FPS |\n|:----------:\t|:-----------:\t|:-------:|:-----:|:-------:|:---:|\n| Total-Text \t| [Res18-4s](https://drive.google.com/file/d/11AtAA429JCha8AZLrp3xVURYZOcCC2s1/view?usp=sharing) \t    |  81.90 |89.88 |85.70 |**32.5**|\n| Total-Text \t| [Res50-1s](https://drive.google.com/file/d/11AtAA429JCha8AZLrp3xVURYZOcCC2s1/view?usp=sharing)\t    |  85.34 |91.81 |88.46 |13.3|\n| Total-Text \t| [Res50-1s+DCN](https://drive.google.com/file/d/11AtAA429JCha8AZLrp3xVURYZOcCC2s1/view?usp=sharing)\t|  **87.93** |**92.44** |**90.13** |13.2|\n| CTW-1500 \t  | 
[Res18-4s](https://drive.google.com/file/d/1pEaY7eU7esq5p_ZKcAfDA-fNqQskroUH/view?usp=sharing) \t    |  81.62 |87.55 |84.48 |**35.3**|\n| CTW-1500  \t| [Res50-1s](https://drive.google.com/file/d/1pEaY7eU7esq5p_ZKcAfDA-fNqQskroUH/view?usp=sharing)\t    |  83.77 |87.30 |85.50 |14.1|\n| CTW-1500  \t| [Res50-1s+DCN](https://drive.google.com/file/d/1pEaY7eU7esq5p_ZKcAfDA-fNqQskroUH/view?usp=sharing)\t|  **84.71** |**88.34** |**86.49** |16.5|\n| MSRA-TD500 \t| [Res18-4s](https://drive.google.com/file/d/1zNdqcXwplor3T6yQuw1f2euYZex63K0b/view?usp=sharing) \t    |  87.46 |92.38 |89.85 |**38.5**|\n| MSRA-TD500  | [Res50-1s](https://drive.google.com/file/d/1zNdqcXwplor3T6yQuw1f2euYZex63K0b/view?usp=sharing)\t    |  85.40 |89.23 |87.27 |15.2|\n| MSRA-TD500  | [Res50-1s+DCN](https://drive.google.com/file/d/1zNdqcXwplor3T6yQuw1f2euYZex63K0b/view?usp=sharing)\t|  **86.77** |**93.69** |**90.10** |15.3|\n| MLT-2017\t  | [Res50-1s](https://drive.google.com/file/d/12_umKAo-eM0NYbAAuJBABlkIbyr5QNCy/view?usp=sharing) \t    |  65.67 |80.49 |72.33 |  - |\n| MLT-2017    | [Res50-1s+DCN](https://drive.google.com/file/d/12_umKAo-eM0NYbAAuJBABlkIbyr5QNCy/view?usp=sharing)\t|  **72.10** |**83.74** |**77.48** |  - |\n| ICDAR-ArT   | [Res50-1s](https://drive.google.com/file/d/1rn8_YKI9B0JO52uA_4nL4TQE0jjgqm9i/view?usp=sharing)\t    |  71.07 |81.14 |75.77 |  - |\n| ICDAR-ArT   | [Res50-1s+DCN](https://drive.google.com/file/d/1rn8_YKI9B0JO52uA_4nL4TQE0jjgqm9i/view?usp=sharing)\t|  **77.05** |**84.48** |**80.59** |  - |\n\nNOTE: The results on ICDAR-ArT and MLT-2017 can also be found on the official competition websites ([ICDAR-ArT](https://rrc.cvc.uab.es/?ch=14&com=evaluation&task=1) and [MLT-2017](https://rrc.cvc.uab.es/?ch=8&com=evaluation&task=1)). 
You can also download the models for each benchmark\nfrom [Baidu Drive](https://pan.baidu.com/s/1lFgRE_qiAv8ww8pbRTTh6w) (download code: wct1)\n\n### Visual comparison\n\n![](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/vis/vis.png)\n\nQualitative comparisons with [TextRay](https://github.com/LianaWang/TextRay), [ABCNet](https://github.com/aim-uofa/AdelaiDet), and [FCENet](https://github.com/open-mmlab/mmocr) on selected challenging samples from CTW-1500. The images (a)-(d) are borrowed from [FCENet](https://arxiv.org/abs/2104.10442).\n\n\n## Reading Order Model\n\n![](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/vis/order.png)\n\n**The Reading Order Model predicts four key corners on the boundary proposal in turn. The reading order module will be released at a certain time.**\n\n\n## Citing the related works\n\nPlease cite the related works in your publications if they help your research:\n``` \n  @inproceedings{DBLP:conf/iccv/Zhang0YWY21,\n  author    = {Shi{-}Xue Zhang and\n               Xiaobin Zhu and\n               Chun Yang and\n               Hongfa Wang and\n               Xu{-}Cheng Yin},\n  title     = {Adaptive Boundary Proposal Network for Arbitrary Shape Text Detection},\n  booktitle = {2021 {IEEE/CVF} International Conference on Computer Vision, {ICCV} 2021, Montreal, QC, Canada, October 10-17, 2021},\n  pages     = {1285--1294},\n  publisher = {{IEEE}},\n  year      = {2021},\n}\n\n@article{zhang2023arbitrary,\n  title={Arbitrary shape text detection via boundary transformer},\n  author={Zhang, Shi-Xue and Yang, Chun and Zhu, Xiaobin and Yin, Xu-Cheng},\n  journal={IEEE Transactions on Multimedia},\n  year={2023},\n  publisher={IEEE}\n}\n\n@article{DBLP:journals/pami/ZhangZCHY23,\n  author       = {Shi{-}Xue Zhang and\n                  Xiaobin Zhu and\n                  Lei Chen and\n                  Jie{-}Bo Hou and\n                  Xu{-}Cheng Yin},\n  title        = {Arbitrary Shape Text Detection via Segmentation With 
Probability Maps},\n  journal      = {{IEEE} Trans. Pattern Anal. Mach. Intell.},\n  volume       = {45},\n  number       = {3},\n  pages        = {2736--2750},\n  year         = {2023},\n  url          = {https://doi.org/10.1109/TPAMI.2022.3176122},\n  doi          = {10.1109/TPAMI.2022.3176122},\n}\n\n@article{zhang2022kernel,\n  title={Kernel proposal network for arbitrary shape text detection},\n  author={Zhang, Shi-Xue and Zhu, Xiaobin and Hou, Jie-Bo and Yang, Chun and Yin, Xu-Cheng},\n  journal={IEEE Transactions on Neural Networks and Learning Systems},\n  year={2022},\n  publisher={IEEE}\n}\n\n@inproceedings{DBLP:conf/cvpr/ZhangZHLYWY20,\n  author       = {Shi{-}Xue Zhang and\n                  Xiaobin Zhu and\n                  Jie{-}Bo Hou and\n                  Chang Liu and\n                  Chun Yang and\n                  Hongfa Wang and\n                  Xu{-}Cheng Yin},\n  title        = {Deep Relational Reasoning Graph Network for Arbitrary Shape Text Detection},\n  booktitle    = {2020 {IEEE/CVF} Conference on Computer Vision and Pattern Recognition,\n                  {CVPR} 2020, Seattle, WA, USA, June 13-19, 2020},\n  pages        = {9696--9705},\n  publisher    = {Computer Vision Foundation / {IEEE}},\n  year         = {2020},\n  doi          = {10.1109/CVPR42600.2020.00972},\n}\n  ``` \n \n ## License\nThis project is licensed under the MIT License - see the [LICENSE.md](https://github.com/GXYM/DRRG/blob/master/LICENSE.md) file for details  \n\n<!---->\n## ✨ Star History\n[![Star History](https://api.star-history.com/svg?repos=GXYM/TextBPN-Plus-Plus&type=Date)](https://star-history.com/#GXYM/TextBPN-Plus-Plus&Date)\n\n\n"
  },
  {
    "path": "cache/README.md",
    "content": "Download the pre-training weights on ImgNet for ResNet18 or ResNet50 ...\n"
  },
  {
    "path": "cfglib/config.py",
    "content": "from easydict import EasyDict\nimport torch\nimport os\n\nconfig = EasyDict()\n\n\n# Normalize image\nconfig.means = (0.485, 0.456, 0.406)\nconfig.stds = (0.229, 0.224, 0.225)\n\nconfig.gpu = \"1\"\n\n# Experiment name #\nconfig.exp_name = \"Synthtext\"\n\n# dataloader jobs number\nconfig.num_workers = 24\n\n# batch_size\nconfig.batch_size = 12\n\n# training epoch number\nconfig.max_epoch = 200\n\nconfig.start_epoch = 0\n\n# learning rate\nconfig.lr = 1e-4\n\n# using GPU\nconfig.cuda = True\n\nconfig.output_dir = 'output'\n\nconfig.input_size = 640\n\n# max polygon per image\n# synText, total-text:64; CTW1500: 64; icdar: 64;  MLT: 32; TD500: 64.\nconfig.max_annotation = 64\n\n# adj num for graph\nconfig.adj_num = 4\n\n# control points number\nconfig.num_points = 20\n\n# use hard examples (annotated as '#')\nconfig.use_hard = True\n\n# Load data into memory at one time\nconfig.load_memory = False\n\n# prediction on 1/scale feature map\nconfig.scale = 1\n\n# # clip gradient of loss\nconfig.grad_clip = 25\n\n# demo tcl threshold\nconfig.dis_threshold = 0.3\n\nconfig.cls_threshold = 0.8\n\n# Contour approximation factor\nconfig.approx_factor = 0.004\n\n\ndef update_config(config, extra_config):\n    for k, v in vars(extra_config).items():\n        config[k] = v\n    # print(config.gpu)\n    config.device = torch.device('cuda') if config.cuda else torch.device('cpu')\n\n\ndef print_config(config):\n    print('==========Options============')\n    for k, v in config.items():\n        print('{}: {}'.format(k, v))\n    print('=============End=============')\n"
  },
  {
    "path": "cfglib/option.py",
    "content": "import argparse\nimport torch\nimport os\nimport torch.backends.cudnn as cudnn\n\nfrom datetime import datetime\n\n\ndef str2bool(v):\n    return v.lower() in (\"yes\", \"true\", \"t\", \"1\")\n\n\ndef arg2str(args):\n    args_dict = vars(args)\n    option_str = datetime.now().strftime('%b%d_%H-%M-%S') + '\\n'\n\n    for k, v in sorted(args_dict.items()):\n        option_str += ('{}: {}\\n'.format(str(k), str(v)))\n\n    return option_str\n\n\nclass BaseOptions(object):\n\n    def __init__(self):\n\n        self.parser = argparse.ArgumentParser()\n\n        # basic opts\n        self.parser.add_argument('--exp_name', default=\"TD500\", type=str,\n                                 choices=['Synthtext', 'Totaltext', 'Ctw1500','Icdar2015',\n                                          \"MLT2017\", 'TD500', \"MLT2019\", \"ArT\", \"ALL\"], help='Experiment name')\n        self.parser.add_argument(\"--gpu\", default=\"1\", help=\"set gpu id\", type=str)\n        self.parser.add_argument('--resume', default=None, type=str, help='Path to target resume checkpoint')\n        self.parser.add_argument('--num_workers', default=24, type=int, help='Number of workers used in dataloading')\n        self.parser.add_argument('--cuda', default=True, type=str2bool, help='Use cuda to train model')\n        self.parser.add_argument('--mgpu', action='store_true', help='Use multi-gpu to train model')\n        self.parser.add_argument('--save_dir', default='./model/', help='Path to save checkpoint models')\n        self.parser.add_argument('--vis_dir', default='./vis/', help='Path to save visualization images')\n        self.parser.add_argument('--log_dir', default='./logs/', help='Path to tensorboard log')\n        self.parser.add_argument('--loss', default='CrossEntropyLoss', type=str, help='Training Loss')\n        # self.parser.add_argument('--input_channel', default=1, type=int, help='number of input channels' )\n        self.parser.add_argument('--pretrain', default=False, 
type=str2bool, help='Pretrained AutoEncoder model')\n        self.parser.add_argument('--verbose', '-v', default=True, type=str2bool, help='Whether to output debug info')\n        self.parser.add_argument('--viz', action='store_true', help='Whether to output debug info')\n        # self.parser.add_argument('--viz', default=True, type=str2bool, help='Whether to output debug info')\n\n        # train opts\n        self.parser.add_argument('--max_epoch', default=250, type=int, help='Max epochs')\n        self.parser.add_argument('--lr', '--learning-rate', default=1e-3, type=float, help='initial learning rate')\n        self.parser.add_argument('--lr_adjust', default='fix',\n                                 choices=['fix', 'poly'], type=str, help='Learning Rate Adjust Strategy')\n        self.parser.add_argument('--stepvalues', default=[], nargs='+', type=int, help='# of iter to change lr')\n        self.parser.add_argument('--weight_decay', '--wd', default=0., type=float, help='Weight decay for SGD')\n        self.parser.add_argument('--gamma', default=0.1, type=float, help='Gamma update for SGD lr')\n        self.parser.add_argument('--momentum', default=0.9, type=float, help='momentum')\n        self.parser.add_argument('--batch_size', default=6, type=int, help='Batch size for training')\n        self.parser.add_argument('--optim', default='Adam', type=str, choices=['SGD', 'Adam'], help='Optimizer')\n        self.parser.add_argument('--save_freq', default=5, type=int, help='save weights every # epoch')\n        self.parser.add_argument('--display_freq', default=10, type=int, help='display training metrics every # iter')\n        self.parser.add_argument('--viz_freq', default=50, type=int, help='visualize training process every # iter')\n        self.parser.add_argument('--log_freq', default=10000, type=int, help='log to tensorboard every # iterations')\n        self.parser.add_argument('--val_freq', default=1000, type=int, help='do validation every # 
iterations')\n\n        # backbone\n        self.parser.add_argument('--scale', default=1, type=int, help='prediction on 1/scale feature map')\n        self.parser.add_argument('--net', default='resnet50', type=str,\n                                 choices=['vgg', 'resnet50', 'resnet18',\n                                          \"deformable_resnet18\", \"deformable_resnet50\"],\n                                 help='Network architecture')\n        # data args\n        self.parser.add_argument('--load_memory', default=False, type=str2bool, help='Load data into memory')\n        self.parser.add_argument('--rescale', type=float, default=255.0, help='rescale factor')\n        self.parser.add_argument('--input_size', default=640, type=int, help='model input size')\n        self.parser.add_argument('--test_size', default=[640, 960], type=int, nargs='+', help='test size')\n\n        # eval args\n        self.parser.add_argument('--checkepoch', default=1070, type=int, help='Load checkpoint number')\n        self.parser.add_argument('--start_epoch', default=0, type=int, help='start epoch number')\n        self.parser.add_argument('--cls_threshold', default=0.875, type=float, help='threshold of pse')\n        self.parser.add_argument('--dis_threshold', default=0.35, type=float, help='filter the score < score_i')\n\n        # demo args\n        self.parser.add_argument('--img_root', default=None,   type=str, help='Path to deploy images')\n\n    def parse(self, fixed=None):\n\n        if fixed is not None:\n            args = self.parser.parse_args(fixed)\n        else:\n            args = self.parser.parse_args()\n\n        return args\n\n    def initialize(self, fixed=None):\n\n        # Parse options\n        self.args = self.parse(fixed)\n        os.environ['CUDA_VISIBLE_DEVICES'] = self.args.gpu\n\n        # Setting default torch Tensor type\n        if self.args.cuda and torch.cuda.is_available():\n            
torch.set_default_tensor_type('torch.cuda.FloatTensor')\n            cudnn.benchmark = True\n        else:\n            torch.set_default_tensor_type('torch.FloatTensor')\n\n        # Create weights saving directory\n        if not os.path.exists(self.args.save_dir):\n            os.mkdir(self.args.save_dir)\n\n        # Create weights saving directory of target model\n        model_save_path = os.path.join(self.args.save_dir, self.args.exp_name)\n\n        if not os.path.exists(model_save_path):\n            os.mkdir(model_save_path)\n\n        return self.args\n\n    def update(self, args, extra_options):\n\n        for k, v in extra_options.items():\n            setattr(args, k, v)\n"
  },
  {
    "path": "data/ArT/README.md",
    "content": "## ArT\n\nAn example of the path of ArT dataset: \n\n```\n├── ArT\n│   ├──Images\n│       ├── Train\n│       ├── Test\n│   ├── gt\n│       ├── train_labels.json\n│       ├── Train # convert the \"train_labels.json\" to \".txt\"\n```  \n"
  },
  {
    "path": "data/CTW-1500/README.md",
    "content": "## CTW-1500 (new)\n\nAn example of the path of CTW-1500 dataset: \n\n```\n├── CTW-1500\n│   ├──Images\n│       ├── Train\n│       ├── Test\n│   ├── gt\n│       ├── train_labels # gt with \".xml\"\n│       ├── test_labels\n```  \n"
  },
  {
    "path": "data/Icdar2015/README.md",
    "content": "## ICDAR 2015  \n\nAn example of the path of ICDAR2015 dataset: \n\n```\n├── Icdar2015\n│   ├──Train (include img and gt files)\n│   ├──Test  (include img and gt files)\n│      \n```  \n"
  },
  {
    "path": "data/LSVT/README.md",
    "content": "## LSVT  \n\nAn example of the path of LSVT dataset: \n\n```\n├── LSVT\n│   ├──Images\n│       ├── Train\n│       ├── ...\n│   ├── gt\n│       ├── train_full_labels.json \n│       ├── ...\n```  \n`  \n\n"
  },
  {
    "path": "data/MLT-2019/README.md",
    "content": "## MLT-2019\n\nAn example of the path of MLT-2019 dataset: \n\n```\n├── MLT-2019\n│   ├──TrainImages\n│   ├──Train_gt  # gt with \".txt\"\n│   ├──TestImages \n│  \n│   ├──train_list.txt \n│   ├──test_list.txt    \n```  \n"
  },
  {
    "path": "data/MLT2017/README.md",
    "content": "## MLT2017\n\nAn example of the path of MLT2017 dataset: \n\n```\n├── MLT2017\n│   ├──mlt_train\n│   ├──mlt_val\n│   ├──mlt_test\n│   \n│   ├──train_list.txt \n│   ├──val_list.txt\n│   ├──test_list.txt \n```  \n"
  },
  {
    "path": "data/README.md",
    "content": "Put the datasets here\n"
  },
  {
    "path": "data/SynthText/README.md",
    "content": "## SynthText\n\nAn example of the path of SynthText dataset: \n\n```\n├── SynthText\n│   ├──1\n│   ├──...\n│   ├──200\n│\n│   ├──gt # gt with \".txt\" \n│   ├──image_list.txt  \n```  \n"
  },
  {
    "path": "data/SynthText/findPath.sh",
    "content": "#!/bin/bash\nroot_dir=\"./gt_e2e/\"\nfn=\"/*.txt\"\nout_fn=\"gt_e2e.txt\"\n\nfor element in `ls $root_dir`\n    do\n    echo $root_dir$element$fn\n    `ls $root_dir$element$fn >>$out_fn`\ndone\n\n"
  },
  {
    "path": "data/TD500/README.md",
    "content": "## TD500\n\nAn example of the path of TD500 dataset: \n\n```\n├── TD500\n│   ├──Train (include img and gt(\".txt\") files)\n│   ├──Test  (include img and gt(\".txt\") files)\n│      \n```  "
  },
  {
    "path": "data/Total-Text/README.md",
    "content": "## Total-Text (new)\n\nAn example of the path of Total-Text dataset: \n\n```\n├── Total-Text\n│   ├──Images\n│       ├── Train\n│       ├── Test\n│   ├── gt\n│       ├──Rectangular\n│             ├── Train  # gt with \".mat\"\n│             ├── Test   # gt with \".mat\"\n│       ├──Polygon\n│             ├── Train  # gt with \".mat\"\n│             ├── Test   # gt with \".mat\"\n│       ├──gt_pixel\n│             ├── Train  # gt with binary img\n│             ├── Test   # gt with binary img\n```  \n\n\n"
  },
  {
    "path": "data/ctw1500/README.md",
    "content": "## CTW-1500 (old)\n\nAn example of the path of CTW-1500 dataset: \n\n```\n├── ctw1500\n│   ├──train\n│       ├── text_image\n│       ├── text_label_circum  # gt with \".txt\"\n│   ├── test\n│       ├── text_image\n│       ├── text_label_circum\n```  \n"
  },
  {
    "path": "data/total-text-mat/README.md",
    "content": "## Total-Text (old)\n\nAn example of the path of Total-Text dataset: \n\n```\n├── total-text-mat\n│   ├──Images\n│       ├── Train\n│       ├── Test\n│   ├── gt\n│       ├── Train # gt with \".mat\"\n│       ├── Test\n```  \n\n\n"
  },
  {
    "path": "dataset/CTW1500_Text_New.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n__author__ = '古溪'\nimport os\nimport cv2\nimport numpy as np\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines\nfrom lxml import etree as ET\n\n\nclass Ctw1500Text_New(TextDataset):\n    def __init__(self, data_root, is_training=True, load_memory=False, transform=None, ignore_list=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        self.image_root = os.path.join(data_root, 'Images', 'Train' if is_training else 'Test')\n        self.annotation_root = os.path.join(data_root, 'gt', 'train_labels' if is_training else 'test_labels')\n        self.image_list = os.listdir(self.image_root)\n        self.annotation_list = ['{}'.format(img_name.replace('.jpg', '')) for img_name in self.image_list]\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_carve_txt(gt_path):\n        \"\"\"\n        .mat file parser\n        :param gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = read_lines(gt_path + \".txt\")\n        polygons = []\n        for line in lines:\n            line = line.split(\",\")\n            gt = list(map(int, line[:-1]))\n            pts = np.stack([gt[0::2], gt[1::2]]).T.astype(np.int32)\n            label = line[-1].split(\"###\")[-1].replace(\"###\", \"#\")\n            polygons.append(TextInstance(pts, 'c', label))\n\n        return polygons\n\n    @staticmethod\n    def parse_carve_xml(gt_path):\n        \"\"\"\n        .mat file parser\n        :param gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        root = ET.parse(gt_path + 
\".xml\").getroot()\n\n        polygons = []\n        for tag in root.findall('image/box'):\n            label = tag.find(\"label\").text.replace(\"###\", \"#\")\n            gt = list(map(int, tag.find(\"segs\").text.split(\",\")))\n            pts = np.stack([gt[0::2], gt[1::2]]).T.astype(np.int32)\n\n            polygons.append(TextInstance(pts, 'c', label))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        # Read image data\n        image = pil_load_img(image_path)\n        try:\n            h, w, c = image.shape\n            assert (c == 3)\n        except:\n            image = cv2.imread(image_path)\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n            image = np.array(image)\n\n        # Read annotation\n        if self.is_training:\n            annotation_id = self.annotation_list[item]\n            annotation_path = os.path.join(self.annotation_root, annotation_id)\n            polygons = self.parse_carve_xml(annotation_path)\n            pass\n        else:\n            annotation_id = self.annotation_list[item]\n            annotation_path = os.path.join(self.annotation_root, \"000\" + annotation_id)\n            polygons = self.parse_carve_txt(annotation_path)\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id.split(\"/\")[-1]\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return 
self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    from util.augmentation import Augmentation\n    from util import canvas as cav\n    import time\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset = Ctw1500Text_New(\n        data_root='../data/CTW-1500',\n        is_training=True,\n        transform=transform\n    )\n\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags = trainset[idx]\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags\\\n            = map(lambda x: x.cpu().numpy(),\n                  (img, train_mask, tr_mask, distance_field,\n                   direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n\n\n        boundary_point = ctrl_points[np.where(ignore_tags != 0)[0]]\n        for i, bpts in enumerate(boundary_point):\n            cv2.drawContours(img, [bpts.astype(np.int32)], -1, (0, 255, 0), 1)\n            for j, pp in enumerate(bpts):\n                if j == 0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                elif j == 1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n\n            ppts = proposal_points[i]\n            cv2.drawContours(img, [ppts.astype(np.int32)], -1, (0, 0, 
255), 1)\n            for j, pp in enumerate(ppts):\n                if j == 0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                elif j == 1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n            cv2.imshow('imgs', img)\n            cv2.waitKey(0)\n\n        # from util.misc import split_edge_seqence\n        # from cfglib.config import config as cfg\n        #\n        # ret, labels = cv2.connectedComponents(np.array(distance_field >0.35, dtype=np.uint8), connectivity=4)\n        # for idx in range(1, ret):\n        #     text_mask = labels == idx\n        #     ist_id = int(np.sum(text_mask*tr_mask)/np.sum(text_mask))-1\n        #     contours, _ = cv2.findContours(text_mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n        #     epsilon = 0.007 * cv2.arcLength(contours[0], True)\n        #     approx = cv2.approxPolyDP(contours[0], epsilon, True).reshape((-1, 2))\n        #\n        #     pts_num = approx.shape[0]\n        #     e_index = [(i, (i + 1) % pts_num) for i in range(pts_num)]\n        #     control_points = split_edge_seqence(approx, e_index, cfg.num_points)\n        #     control_points = np.array(control_points[:cfg.num_points, :]).astype(np.int32)\n        #\n        #     cv2.drawContours(img, [ctrl_points[ist_id].astype(np.int32)], -1, (0, 255, 0), 1)\n        #     cv2.drawContours(img, [control_points.astype(np.int32)], -1, (0, 0, 255), 1)\n        #     for j,  pp in enumerate(control_points):\n        #         if j == 0:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n        #         elif j == 1:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n        #         else:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 
255, 0), -1)\n        #\n        #     cv2.imshow('imgs', img)\n        #     cv2.waitKey(0)\n"
  },
  {
    "path": "dataset/Icdar15_Text.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n__author__ = '古溪'\nimport re\nimport os\nimport numpy as np\nfrom util import strs\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines\nfrom util.misc import norm2\n\n\nclass Icdar15Text(TextDataset):\n\n    def __init__(self, data_root, is_training=True, load_memory=False, transform=None, ignore_list=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        self.image_root = os.path.join(data_root, 'Train' if is_training else 'Test')\n        self.annotation_root = os.path.join(data_root,'Train' if is_training else 'Test')\n        self.image_list = os.listdir(self.image_root)\n\n        p = re.compile('.rar|.txt')\n        self.image_list = [x for x in self.image_list if not p.findall(x)]\n        p = re.compile('(.jpg|.JPG|.PNG|.JPEG)')\n        self.annotation_list = ['{}'.format(p.sub(\"\", img_name)) for img_name in self.image_list]\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_txt(gt_path):\n        \"\"\"\n        .mat file parser\n        :param gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = read_lines(gt_path+\".txt\")\n        polygons = []\n        for line in lines:\n            line = strs.remove_all(line.strip('\\ufeff'), '\\xef\\xbb\\xbf')\n            gt = line.split(',')\n            x1, y1, x2, y2, x3, y3, x4, y4 = list(map(int, gt[:8]))\n            xx = [x1, x2, x3, x4]\n            yy = [y1, y2, y3, y4]\n\n            label = gt[-1].strip().replace(\"###\", \"#\")\n\n            pts = np.stack([xx, yy]).T.astype(np.int32)\n            # d1 = norm2(pts[0] - 
pts[1])\n            # d2 = norm2(pts[1] - pts[2])\n            # d3 = norm2(pts[2] - pts[3])\n            # d4 = norm2(pts[3] - pts[0])\n            # if min([d1, d2, d3, d4]) < 2:\n            #     continue\n            polygons.append(TextInstance(pts, 'c', label))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        # Read image data\n        image = pil_load_img(image_path)\n        try:\n            # Read annotation\n            annotation_id = self.annotation_list[item]\n            annotation_path = os.path.join(self.annotation_root, annotation_id)\n            polygons = self.parse_txt(annotation_path)\n        except Exception as e:\n            print(e)\n            polygons = None\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id.split(\"/\")[-1]\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    import cv2\n    from util.augmentation import Augmentation\n    from util import canvas as cav\n    import time\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset 
= Icdar15Text(\n        data_root='../data/Icdar2015',\n        is_training=True,\n        transform=transform\n    )\n\n    # img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, meta = trainset[944]\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask = trainset[idx]\n        img, train_mask, tr_mask = map(lambda x: x.cpu().numpy(), (img, train_mask, tr_mask))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n        print(idx, img.shape)\n\n        for i in range(tr_mask.shape[2]):\n            cv2.imshow(\"tr_mask_{}\".format(i),\n                       cav.heatmap(np.array(tr_mask[:, :, i] * 255 / np.max(tr_mask[:, :, i]), dtype=np.uint8)))\n\n        cv2.imshow(\"train_mask\", cav.heatmap(np.array(train_mask * 255 / np.max(train_mask), dtype=np.uint8)))\n\n        cv2.imshow('imgs', img)\n        cv2.waitKey(0)\n"
  },
  {
    "path": "dataset/Icdar17_Text.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport os\nimport numpy as np\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines\nfrom util.misc import norm2\nfrom util import strs\nimport cv2\n\n\nclass Mlt2017Text(TextDataset):\n\n    def __init__(self, data_root, is_training=True, load_memory=False, transform=None, ignore_list=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        if is_training:\n            with open(os.path.join(data_root, 'train_list.txt')) as f:\n                self.img_train_list = [line.strip() for line in f.readlines()]\n\n            with open(os.path.join(data_root, 'val_list.txt')) as f:\n                self.img_val_list = [line.strip() for line in f.readlines()]\n\n            # with open(os.path.join(data_root, 'ic15_list.txt')) as f:\n            #     self.img_15_list = [line.strip() for line in f.readlines()]\n\n            if ignore_list:\n                with open(ignore_list) as f:\n                    ignore_list = f.readlines()\n                    ignore_list = [line.strip() for line in ignore_list]\n            else:\n                ignore_list = []\n\n            self.img_list = self.img_val_list + self.img_train_list #+self.img_15_list\n            # self.img_list = list(filter(lambda img: img.replace('', '') not in ignore_list, self.img_list))\n        else:\n            with open(os.path.join(data_root, 'test_list.txt')) as f:\n                self.img_list = [line.strip() for line in f.readlines()]\\\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_txt(gt_path):\n        \"\"\"\n        .mat file parser\n        :param 
gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = read_lines(gt_path + \".txt\")\n        polygons = []\n        for line in lines:\n            line = strs.remove_all(line.strip('\\ufeff'), '\\xef\\xbb\\xbf')\n            gt = line.split(',')\n            x1, y1, x2, y2, x3, y3, x4, y4 = list(map(int, gt[:8]))\n            xx = [x1, x2, x3, x4]\n            yy = [y1, y2, y3, y4]\n            if gt[-1].strip() == \"###\":\n                label = gt[-1].strip().replace(\"###\", \"#\")\n            else:\n                label = \"GG\"\n            pts = np.stack([xx, yy]).T.astype(np.int32)\n            polygons.append(TextInstance(pts, 'c', label))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.img_list[item]\n        if self.is_training:\n            # Read annotation\n            annotation_id = \"{}/gt_{}\".format(\"/\".join(image_id.split(\"/\")[0:-1]),\n                                              image_id.split(\"/\")[-1].replace(\".jpg\", ''))\n            annotation_path = os.path.join(self.data_root, annotation_id)\n\n            polygons = self.parse_txt(annotation_path)\n        else:\n            polygons = None\n\n        # Read image data\n        image_path = os.path.join(self.data_root, image_id)\n        image = pil_load_img(image_path)\n        try:\n            h, w, c = image.shape\n            assert (c == 3)\n        except:\n            image = cv2.imread(image_path)\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n            image = np.array(image)\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id.split(\"/\")[-1]\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        
if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.img_list)\n\n\nif __name__ == '__main__':\n    import os\n    from util.augmentation import BaseTransform, Augmentation\n    from util import canvas as cav\n    import time\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset = Mlt2017Text(\n        data_root='../data/MLT2017',\n        is_training=True,\n        transform=transform\n    )\n    # img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, meta = trainset[944]\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask = trainset[idx]\n        img, train_mask, tr_mask = map(lambda x: x.cpu().numpy(), (img, train_mask, tr_mask))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n        print(idx, img.shape)\n\n        for i in range(tr_mask.shape[2]):\n            cv2.imshow(\"tr_mask_{}\".format(i),\n                       cav.heatmap(np.array(tr_mask[:, :, i] * 255 / np.max(tr_mask[:, :, i]), dtype=np.uint8)))\n        cv2.imshow(\"tr_mask\",\n                   cav.heatmap(np.array(train_mask * 255 / np.max(train_mask), dtype=np.uint8)))\n\n        cv2.imshow('imgs', img)\n        cv2.waitKey(0)\n\n"
  },
  {
    "path": "dataset/Icdar19ArT_Json.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport os\nimport numpy as np\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines, load_json\nimport cv2\n\n\nclass ArtTextJson(TextDataset):\n    def __init__(self, data_root, is_training=True, ignore_list=None, load_memory=False, transform=None):\n        super().__init__(transform, is_training)\n\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        if ignore_list:\n            with open(ignore_list) as f:\n                ignore_list = f.readlines()\n                ignore_list = [line.strip() for line in ignore_list]\n        else:\n            ignore_list = []\n\n        self.image_root = os.path.join(data_root, \"Images\", \"Train\" if is_training else \"Test\")\n        self.image_list = os.listdir(self.image_root)\n        self.image_list = list(filter(lambda img: img.replace(\".jpg\", \"\") not in ignore_list, self.image_list))\n\n        if self.is_training:\n            annotation_file = os.path.join(data_root, \"gt\", \"train_labels.json\" if is_training else \"None\")\n            self.annotation_data = load_json(annotation_file)\n            self.image_list, self.annotationdata_list = self.preprocess(self.image_list, self.annotation_data)\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def preprocess(image_list: list, annotation_data: dict):\n        \"\"\"\n        Decompose the all in one annotation_dict into seperate list element(annotation_list).\n        The order of the annotation_list will be the same with image_list. 
To keep it simple,\n        here both image_list and annotationdata_list will be sorted following the same criteria.\n        \"\"\"\n        annotationdata_list = [\n            v for _, v in sorted(annotation_data.items(), key=lambda item: item[0])\n        ]\n        image_list = sorted(image_list)\n        return image_list, annotationdata_list\n\n    def parse_curve_txt(self, gt_data):\n        polygons = []\n        for candidate in gt_data:\n            text = candidate.get(\"transcription\").strip().replace(\"###\", \"#\")\n            pts = candidate.get(\"points\")\n            pts = np.array(pts).astype(np.int32)\n            if pts.shape[0] < 4:\n                continue\n            polygons.append(TextInstance(pts, \"c\", text))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        image = pil_load_img(image_path)\n        try:\n            assert image.shape[-1] == 3\n        except:\n            image = cv2.imread(image_path)\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n            image = np.array(image)\n\n        if self.is_training:\n            # Read annotation\n            annotation_data = self.annotationdata_list[item]\n            polygons = self.parse_curve_txt(annotation_data)\n        else:\n            polygons = None\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        
else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == \"__main__\":\n    # execute in base dir `PYTHONPATH=.:$PYTHONPATH python dataset/Icdar19ArT_Text.py`\n    from util.augmentation import Augmentation\n    from util.misc import regularize_sin_cos\n    from util.pbox import bbox_transfor_inv, minConnectPath\n    from util import canvas as cav\n    import time\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(size=640, mean=means, std=stds)\n\n    trainset = ArtTextJson(\n        data_root=\"/home/prir1005/pubdata/ArT\",\n        is_training=True,\n        transform=transform,\n    )\n\n    t0 = time.time()\n    img, train_mask, tr_mask, distance_field, \\\n    direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags = trainset[1000]\n    img, train_mask, tr_mask, distance_field, \\\n    direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags \\\n        = map(lambda x: x.cpu().numpy(),\n              (img, train_mask, tr_mask, distance_field,\n               direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags))\n\n    img = img.transpose(1, 2, 0)\n    img = ((img * stds + means) * 255).astype(np.uint8)\n\n    distance_map = cav.heatmap(np.array(distance_field * 255 / np.max(distance_field), dtype=np.uint8))\n    cv2.imshow(\"distance_map\", distance_map)\n    cv2.waitKey(0)\n\n    direction_map = cav.heatmap(np.array(direction_field[0] * 255 / np.max(direction_field[0]), dtype=np.uint8))\n    cv2.imshow(\"direction_field\", direction_map)\n    cv2.waitKey(0)\n\n    from util.vis_flux import vis_direction_field\n\n    vis_direction_field(direction_field)\n\n    weight_map = cav.heatmap(np.array(weight_matrix * 255 / np.max(weight_matrix), 
dtype=np.uint8))\n    cv2.imshow(\"weight_matrix\", weight_map)\n    # cv2.waitKey(0)\n\n    boundary_point = ctrl_points[np.where(ignore_tags != 0)[0]]\n    for i, bpts in enumerate(boundary_point):\n        cv2.drawContours(img, [bpts.astype(np.int32)], -1, (0, 255, 0), 1)\n        for j, pp in enumerate(bpts):\n            if j == 0:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n            elif j == 1:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n            else:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n\n        ppts = proposal_points[i]\n        cv2.drawContours(img, [ppts.astype(np.int32)], -1, (0, 0, 255), 1)\n        for j, pp in enumerate(ppts):\n            if j == 0:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n            elif j == 1:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n            else:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n        cv2.imshow('imgs', img)\n        cv2.waitKey(0)\n"
  },
  {
    "path": "dataset/Icdar19ArT_Text.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport warnings\nwarnings.filterwarnings(\"ignore\")\nimport os\nimport re\nimport numpy as np\nimport scipy.io as io\nfrom util import strs\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nimport cv2\nfrom util import io as libio\n\n\nclass ArtText(TextDataset):\n\n    def __init__(self, data_root, ignore_list=None, is_training=True, load_memory=False, transform=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        if ignore_list:\n            with open(ignore_list) as f:\n                ignore_list = f.readlines()\n                ignore_list = [line.strip() for line in ignore_list]\n        else:\n            ignore_list = []\n\n        self.image_root = os.path.join(data_root, 'Images', 'Train' if is_training else 'Test')\n        self.annotation_root = os.path.join(data_root, 'gt', 'Train' if is_training else 'Test')\n        self.image_list = os.listdir(self.image_root)\n        self.image_list = list(filter(lambda img: img.replace('.jpg', '') not in ignore_list, self.image_list))\n        self.annotation_list = ['{}'.format(img_name.replace('.jpg', '')) for img_name in self.image_list]\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_carve_txt(gt_path):\n        \"\"\"\n        .mat file parser\n        :param gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = libio.read_lines(gt_path + \".txt\")\n        polygons = []\n        for line in lines:\n            line = strs.remove_all(line, '\\xef\\xbb\\xbf')\n            gt = line.split(',')\n            gt_corrdinate = gt[:-3]\n            if 
len(gt_corrdinate) < 6:\n                continue\n            pts = np.stack([gt_corrdinate[0::2], gt_corrdinate[1::2]]).T.astype(np.int32)\n            text = gt[-1].replace(\"\\n\",\"\")\n            polygons.append(TextInstance(pts, 'c', text))\n        # print(polygon)\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        # Read image data\n        image = pil_load_img(image_path)\n        try:\n            h, w, c = image.shape\n            assert (c == 3)\n        except:\n            image = cv2.imread(image_path)\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n            image = np.array(image)\n\n        # Read annotation\n        annotation_id = self.annotation_list[item]\n        annotation_path = os.path.join(self.annotation_root, annotation_id)\n        # polygons = self.parse_mat(annotation_path)\n        polygons = self.parse_carve_txt(annotation_path)\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    import time\n    from util.augmentation import Augmentation, BaseTransformNresize\n    from util import 
canvas as cav\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset = ArtText(\n        data_root=\"/home/prir1005/pubdata/ArT\",\n        is_training=True,\n        transform=transform,\n    )\n\n    t0 = time.time()\n    image,  tr_mask, train_mask, label_mask, gt_points, ignore_tags = trainset[30]\n    image,  tr_mask, train_mask, label_mask, gt_points, ignore_tags = \\\n        map(lambda x: x.cpu().numpy(), (image,  tr_mask, train_mask, label_mask, gt_points, ignore_tags))\n\n    img = image.transpose(1, 2, 0)\n    img = ((img * stds + means) * 255).astype(np.uint8)\n\n    for i in range(tr_mask.shape[0]):\n        heatmap = cav.heatmap(np.array(tr_mask[i, :, :] * 255 / np.max(tr_mask[i, :, :]), dtype=np.uint8))\n        cv2.imshow(\"tr_mask_{}\".format(i), heatmap)\n        cv2.imshow('train_mask_{}'.format(i), cav.heatmap(np.array(train_mask[i] * 255 / np.max(train_mask[i]), dtype=np.uint8)))\n\n    boundary_points = gt_points[np.where(ignore_tags != 0)[0]]\n    ignore_points = gt_points[np.where(ignore_tags == -1)[0]]\n    for i in range(tr_mask.shape[0]):\n        im = img.copy()\n        gt_point = boundary_points[:, i, :, :]\n        ignore_point = ignore_points[:, i, :, :]\n        cv2.drawContours(im, gt_point.astype(np.int32), -1, (0, 255, 0), 1)\n        cv2.drawContours(im, ignore_point.astype(np.int32), -1, (0, 0, 255), 1)\n        cv2.imshow('imgs_{}'.format(i), im)\n        cv2.waitKey(0)\n\n\n\n\n\n\n"
  },
  {
    "path": "dataset/Icdar19LSVT_Json.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport os\nimport numpy as np\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines, load_json\nimport cv2\n\n\nclass LsvtTextJson(TextDataset):\n    def __init__(self, data_root, is_training=True, ignore_list=None,\n                 care_flag=True, load_memory=False, transform=None):\n        super().__init__(transform, is_training)\n\n        self.data_root = data_root\n        self.is_training = is_training\n        self.care_flag = care_flag\n        self.load_memory = load_memory\n\n        if ignore_list:\n            with open(ignore_list) as f:\n                ignore_list = f.readlines()\n                ignore_list = [line.strip() for line in ignore_list]\n        else:\n            ignore_list = []\n\n        self.image_root = os.path.join(data_root, \"Images\", \"Train\" if is_training else \"Test\")\n        self.image_list = os.listdir(self.image_root)\n        self.image_list = list(filter(lambda img: img.replace(\".jpg\", \"\") not in ignore_list, self.image_list))\n\n        if self.is_training:\n            annotation_file = os.path.join(data_root, \"gt\", \"train_full_labels.json\" if is_training else \"None\")\n            self.annotation_data = load_json(annotation_file)\n            self.image_list, self.annotationdata_list = self.preprocess(self.image_list, self.annotation_data)\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def preprocess(image_list: list, annotation_data: dict):\n        \"\"\"\n        Decompose the all in one annotation_dict into seperate list element(annotation_list).\n        The order of the annotation_list will be the same with image_list. 
To keep it simple,\n        here both image_list and annotationdata_list will be sorted following the same criteria.\n        \"\"\"\n        annotationdata_list = [\n            v for _, v in sorted(annotation_data.items(), key=lambda item: item[0])\n        ]\n        image_list = sorted(image_list)\n        return image_list, annotationdata_list\n\n    def parse_curve_txt(self, gt_data):\n        polygons = []\n        for candidate in gt_data:\n            text = candidate.get(\"transcription\").strip().replace(\"###\", \"#\")\n            pts = candidate.get(\"points\")\n            pts = np.array(pts).astype(np.int32)\n            if pts.shape[0] < 4:\n                continue\n            polygons.append(TextInstance(pts, \"c\", text))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        image = pil_load_img(image_path)\n        try:\n            assert image.shape[-1] == 3\n        except:\n            image = cv2.imread(image_path)\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n            image = np.array(image)\n\n        if self.is_training:\n            # Read annotation\n            annotation_data = self.annotationdata_list[item]\n            polygons = self.parse_curve_txt(annotation_data)\n        else:\n            polygons = None\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        
else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == \"__main__\":\n    # execute in base dir `PYTHONPATH=.:$PYTHONPATH python dataset/Icdar19ArT_Text.py`\n    from util.augmentation import Augmentation\n    from util.misc import regularize_sin_cos\n    from util.pbox import bbox_transfor_inv, minConnectPath\n    from util import canvas as cav\n    import time\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(size=640, mean=means, std=stds)\n\n    trainset = LsvtTextJson(\n        data_root=\"/home/prir1005/pubdata/LSVT\",\n        is_training=True,\n        transform=transform,\n    )\n\n    t0 = time.time()\n    img, train_mask, tr_mask, distance_field, \\\n    direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags = trainset[1567]\n    img, train_mask, tr_mask, distance_field, \\\n    direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags \\\n        = map(lambda x: x.cpu().numpy(),\n              (img, train_mask, tr_mask, distance_field,\n               direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags))\n\n    img = img.transpose(1, 2, 0)\n    img = ((img * stds + means) * 255).astype(np.uint8)\n\n    distance_map = cav.heatmap(np.array(distance_field * 255 / np.max(distance_field), dtype=np.uint8))\n    cv2.imshow(\"distance_map\", distance_map)\n    cv2.waitKey(0)\n\n    direction_map = cav.heatmap(np.array(direction_field[0] * 255 / np.max(direction_field[0]), dtype=np.uint8))\n    cv2.imshow(\"direction_field\", direction_map)\n    cv2.waitKey(0)\n\n    from util.vis_flux import vis_direction_field\n\n    vis_direction_field(direction_field)\n\n    weight_map = cav.heatmap(np.array(weight_matrix * 255 / np.max(weight_matrix), 
dtype=np.uint8))\n    cv2.imshow(\"weight_matrix\", weight_map)\n    # cv2.waitKey(0)\n\n    boundary_point = ctrl_points[np.where(ignore_tags != 0)[0]]\n    for i, bpts in enumerate(boundary_point):\n        cv2.drawContours(img, [bpts.astype(np.int32)], -1, (0, 255, 0), 1)\n        for j, pp in enumerate(bpts):\n            if j == 0:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n            elif j == 1:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n            else:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n\n        ppts = proposal_points[i]\n        cv2.drawContours(img, [ppts.astype(np.int32)], -1, (0, 0, 255), 1)\n        for j, pp in enumerate(ppts):\n            if j == 0:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n            elif j == 1:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n            else:\n                cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n        cv2.imshow('imgs', img)\n        cv2.waitKey(0)\n"
  },
  {
    "path": "dataset/Icdar19_Text.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport os\nimport numpy as np\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines\nfrom util.misc import norm2\nfrom util import strs\nimport cv2\n\n\nclass Mlt2019Text(TextDataset):\n\n    def __init__(self, data_root, is_training=True, transform=None, load_memory=False, ignore_list=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        self.image_root = os.path.join(data_root, 'TrainImages' if is_training else 'TestImages')\n        self.annotation_root = os.path.join(data_root, 'Train_gt' if is_training else None)\n\n        if is_training:\n            with open(os.path.join(data_root, 'train_list.txt')) as f:\n                self.img_list = [line.strip() for line in f.readlines()]\n\n            if ignore_list:\n                with open(ignore_list) as f:\n                    ignore_list = f.readlines()\n                    ignore_list = [line.strip() for line in ignore_list]\n            else:\n                ignore_list = []\n        else:\n            with open(os.path.join(data_root, 'test_list.txt')) as f:\n                self.img_list = [line.strip() for line in f.readlines()]\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_txt(gt_path):\n        \"\"\"\n        .mat file parser\n        :param gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = read_lines(gt_path + \".txt\")\n        polygons = []\n        for line in lines:\n            line = strs.remove_all(line.strip('\\ufeff'), '\\xef\\xbb\\xbf')\n            gt = line.split(',')\n            x1, y1, x2, y2, 
x3, y3, x4, y4 = list(map(int, gt[:8]))\n            xx = [x1, x2, x3, x4]\n            yy = [y1, y2, y3, y4]\n            if gt[-1].strip() == \"###\":\n                label = gt[-1].strip().replace(\"###\", \"#\")\n            else:\n                label = \"GG\"\n            pts = np.stack([xx, yy]).T.astype(np.int32)\n\n            polygons.append(TextInstance(pts, 'c', label))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.img_list[item]\n\n        if self.is_training:\n            # Read annotation\n            annotation_path = os.path.join(self.annotation_root, image_id.split(\".\")[0])\n\n            polygons = self.parse_txt(annotation_path)\n        else:\n            polygons = None\n\n        # Read image data\n        image_path = os.path.join(self.image_root, image_id)\n        image = pil_load_img(image_path)\n        try:\n            h, w, c = image.shape\n            assert (c == 3)\n        except:\n            # print(image_path)\n            image = cv2.imread(image_path)\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n            image = np.array(image)\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id.split(\"/\")[-1]\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return 
len(self.img_list)\n\n\nif __name__ == '__main__':\n    import os\n    from util.augmentation import BaseTransform, Augmentation\n    from util import canvas as cav\n    import time\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset = Mlt2019Text(\n        data_root='../data/MLT-2019',\n        is_training=True,\n        transform=transform\n    )\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags = trainset[idx]\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags \\\n            = map(lambda x: x.cpu().numpy(),\n                  (img, train_mask, tr_mask, distance_field,\n                   direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n\n        distance_map = cav.heatmap(np.array(distance_field * 255 / np.max(distance_field), dtype=np.uint8))\n        cv2.imshow(\"distance_map\", distance_map)\n        cv2.waitKey(0)\n\n        direction_map = cav.heatmap(np.array(direction_field[0] * 255 / np.max(direction_field[0]), dtype=np.uint8))\n        cv2.imshow(\"direction_field\", direction_map)\n        cv2.waitKey(0)\n\n        from util.vis_flux import vis_direction_field\n\n        vis_direction_field(direction_field)\n\n        weight_map = cav.heatmap(np.array(weight_matrix * 255 / np.max(weight_matrix), dtype=np.uint8))\n        cv2.imshow(\"weight_matrix\", weight_map)\n        # cv2.waitKey(0)\n\n        boundary_point = ctrl_points[np.where(ignore_tags != 0)[0]]\n        for i, bpts in enumerate(boundary_point):\n            cv2.drawContours(img, [bpts.astype(np.int32)], -1, (0, 
255, 0), 1)\n            for j, pp in enumerate(bpts):\n                if j == 0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                elif j == 1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n\n            ppts = proposal_points[i]\n            cv2.drawContours(img, [ppts.astype(np.int32)], -1, (0, 0, 255), 1)\n            for j, pp in enumerate(ppts):\n                if j == 0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                elif j == 1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n            cv2.imshow('imgs', img)\n            cv2.waitKey(0)\n\n        # from util.misc import split_edge_seqence\n        # from cfglib.config import config as cfg\n        #\n        # ret, labels = cv2.connectedComponents(np.array(distance_field >0.35, dtype=np.uint8), connectivity=4)\n        # for idx in range(1, ret):\n        #     text_mask = labels == idx\n        #     ist_id = int(np.sum(text_mask*tr_mask)/np.sum(text_mask))-1\n        #     contours, _ = cv2.findContours(text_mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n        #     epsilon = 0.007 * cv2.arcLength(contours[0], True)\n        #     approx = cv2.approxPolyDP(contours[0], epsilon, True).reshape((-1, 2))\n        #\n        #     control_points = split_edge_seqence(approx, cfg.num_points)\n        #     control_points = np.array(control_points[:cfg.num_points, :]).astype(np.int32)\n        #\n        #     cv2.drawContours(img, [ctrl_points[ist_id].astype(np.int32)], -1, (0, 255, 0), 1)\n        #     cv2.drawContours(img, [control_points.astype(np.int32)], -1, (0, 0, 255), 1)\n        #     
for j,  pp in enumerate(control_points):\n        #         if j == 0:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n        #         elif j == 1:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n        #         else:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 0), -1)\n        #\n        #     cv2.imshow('imgs', img)\n        #     cv2.waitKey(0)\n\n"
  },
  {
    "path": "dataset/README.md",
    "content": "You can add your dataset load scripts here, and details you refer to the loading script of other datasets, such as CTW1500, Total-Text, and MLT2017 ..."
  },
  {
    "path": "dataset/TD500/075eval.sh",
    "content": "rm submit/*\ncp $1/*.txt submit\ncd submit/;zip -r  submit.zip *;mv submit.zip ../; cd ../\npython2 script075.py -g=gt.zip -s=submit.zip\n"
  },
  {
    "path": "dataset/TD500/Evaluation_Protocol/readme.txt",
    "content": "INSTRUCTIONS FOR THE STANDALONE SCRIPTS\nRequirements:\n-         Python version 2.7.\n-         Each Task requires different Python modules. When running the script, if some module is not installed you will see a notification and installation instructions.\n \nProcedure:\nDownload the ZIP file for the requested script and unzip it to a directory.\n \nOpen a terminal in the directory and run the command:\npython script.py –g=gt.zip –s=submit.zip\n \nIf you have already installed all the required modules, then you will see the method’s results or an error message if the submitted file is not correct.\n \nparameters:\n-g: Path of the Ground Truth file. In most cases, the Ground Truth will be included in the same Zip file named 'gt.zip', gt.txt' or 'gt.json'. If not, you will be able to get it on the Downloads page of the Task.\n-s: Path of your method's results file.\n \nOptional parameters:\n-o: Path to a directory where to copy the file ‘results.zip’ that contains per-sample results.\n-p: JSON string parameters to override the script default parameters. The parameters that can be overrided are inside the function 'default_evaluation_params' located at the begining of the evaluation Script.\n \nExample: python script.py –g=gt.zip –s=submit.zip –o=./ -p='{\" IOU_CONSTRAINT\" = 0.8}'"
  },
  {
    "path": "dataset/TD500/Evaluation_Protocol/rrc_evaluation_funcs.py",
    "content": "#!/usr/bin/env python2\n#encoding: UTF-8\nimport json\nimport sys;sys.path.append('./')\nimport zipfile\nimport re\nimport sys\nimport os\nimport codecs\nimport importlib\n# from StringIO import StringIO\n\ndef print_help():\n    sys.stdout.write('Usage: python %s.py -g=<gtFile> -s=<submFile> [-o=<outputFolder> -p=<jsonParams>]' %sys.argv[0])\n    sys.exit(2)\n    \n\ndef load_zip_file_keys(file,fileNameRegExp=''):\n    \"\"\"\n    Returns an array with the entries of the ZIP file that match with the regular expression.\n    The key's are the names or the file or the capturing group definied in the fileNameRegExp\n    \"\"\"\n    try:\n        archive=zipfile.ZipFile(file, mode='r', allowZip64=True)\n    except :\n        raise Exception('Error loading the ZIP archive.')\n\n    pairs = []\n    \n    for name in archive.namelist():\n        addFile = True\n        keyName = name\n        if fileNameRegExp!=\"\":\n            m = re.match(fileNameRegExp,name)\n            if m == None:\n                addFile = False\n            else:\n                if len(m.groups())>0:\n                    keyName = m.group(1)\n                    \n        if addFile:\n            pairs.append( keyName )\n                \n    return pairs\n    \n\ndef load_zip_file(file,fileNameRegExp='',allEntries=False):\n    \"\"\"\n    Returns an array with the contents (filtered by fileNameRegExp) of a ZIP file.\n    The key's are the names or the file or the capturing group definied in the fileNameRegExp\n    allEntries validates that all entries in the ZIP file pass the fileNameRegExp\n    \"\"\"\n    try:\n        archive=zipfile.ZipFile(file, mode='r', allowZip64=True)\n    except :\n        raise Exception('Error loading the ZIP archive')    \n\n    pairs = []\n    for name in archive.namelist():\n        addFile = True\n        keyName = name\n        if fileNameRegExp!=\"\":\n            m = re.match(fileNameRegExp,name)\n            if m == None:\n                
addFile = False\n            else:\n                if len(m.groups())>0:\n                    keyName = m.group(1)\n        \n        if addFile:\n            pairs.append( [ keyName , archive.read(name)] )\n        else:\n            if allEntries:\n                raise Exception('ZIP entry not valid: %s' %name)             \n\n    return dict(pairs)\n\t\ndef decode_utf8(raw):\n    \"\"\"\n    Returns a Unicode object on success, or None on failure\n    \"\"\"\n    try:\n        raw = codecs.decode(raw,'utf-8', 'replace')\n        #extracts BOM if exists\n        raw = raw.encode('utf8')\n        if raw.startswith(codecs.BOM_UTF8):\n            raw = raw.replace(codecs.BOM_UTF8, '', 1)\n        return raw.decode('utf-8')\n    except:\n       return None\n   \ndef validate_lines_in_file(fileName,file_contents,CRLF=True,LTRB=True,withTranscription=False,withConfidence=False,imWidth=0,imHeight=0):\n    \"\"\"\n    This function validates that all lines of the file calling the Line validation function for each line\n    \"\"\"\n    utf8File = decode_utf8(file_contents)\n    if (utf8File is None) :\n        raise Exception(\"The file %s is not UTF-8\" %fileName)\n\n    lines = utf8File.split( \"\\r\\n\" if CRLF else \"\\n\" )\n    for line in lines:\n        line = line.replace(\"\\r\",\"\").replace(\"\\n\",\"\")\n        if(line != \"\"):\n            try:\n                validate_tl_line(line,LTRB,withTranscription,withConfidence,imWidth,imHeight)\n            except Exception as e:\n                raise Exception((\"Line in sample not valid. Sample: %s Line: %s Error: %s\" %(fileName,line,str(e))).encode('utf-8', 'replace'))\n    \n   \n   \ndef validate_tl_line(line,LTRB=True,withTranscription=True,withConfidence=True,imWidth=0,imHeight=0):\n    \"\"\"\n    Validate the format of the line. 
If the line is not valid an exception will be raised.\n    If maxWidth and maxHeight are specified, all points must be inside the image bounds.\n    Possible values are:\n    LTRB=True: xmin,ymin,xmax,ymax[,confidence][,transcription] \n    LTRB=False: x1,y1,x2,y2,x3,y3,x4,y4[,confidence][,transcription] \n    \"\"\"\n    get_tl_line_values(line,LTRB,withTranscription,withConfidence,imWidth,imHeight)\n    \n   \ndef get_tl_line_values(line,LTRB=True,withTranscription=False,withConfidence=False,imWidth=0,imHeight=0,withAngel=True):\n    \"\"\"\n    Validate the format of the line. If the line is not valid an exception will be raised.\n    If maxWidth and maxHeight are specified, all points must be inside the image bounds.\n    Possible values are:\n    LTRB=True: xmin,ymin,xmax,ymax[,confidence][,transcription] \n    LTRB=False: x1,y1,x2,y2,x3,y3,x4,y4[,confidence][,transcription] \n    Returns values from a textline. Points , [Confidences], [Transcriptions]\n    \"\"\"\n    confidence = 0.0\n    transcription = \"\"\n    points = []\n    angle = 0.0  # default, so the LTRB=False branch can still return an angle\n    \n    numPoints = 4\n    \n    if LTRB:\n    \n        numPoints = 4\n        if withAngel:\n            if withTranscription and withConfidence:\n                m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*(-?[0-1].?[0-9]*)\\s*,\\s*([0-1].?[0-9]*)\\s*,(.*)$',line)\n                if m == None :\n                    raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,angle,confidence,transcription\")\n            elif withConfidence:\n                m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*(-?[0-1].?[0-9]*)\\s*,\\s*([0-1].?[0-9]*)\\s*$',line)\n                if m == None :\n                    raise Exception(\"Format incorrect. 
Should be: xmin,ymin,xmax,ymax,angle,confidence\")\n            elif withTranscription:\n                # print('---------------')\n                m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*(-?[0-1].?[0-9]*)\\s*,(.*)$',line)\n                if m == None :\n                    raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,angle,transcription\")\n            else:\n                m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*(-?[0-1].?[0-9]*)\\s*,?\\s*$',line)\n                if m == None :\n                   raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,angle\")\n        elif withTranscription and withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*,(.*)$',line)\n            if m == None :\n                m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*,(.*)$',line)\n                raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,confidence,transcription\")\n        elif withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,confidence\")\n        elif withTranscription:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,(.*)$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,transcription\")\n        else:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,?\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. 
Should be: xmin,ymin,xmax,ymax\")\n            \n        xmin = int(m.group(1))\n        ymin = int(m.group(2))\n        xmax = int(m.group(3))\n        ymax = int(m.group(4))\n        # print(m.group(5))\n        angle = float(m.group(5))\n        if(xmax<xmin):\n                raise Exception(\"Xmax value (%s) not valid (Xmax < Xmin).\" %(xmax))\n        if(ymax<ymin):\n                raise Exception(\"Ymax value (%s)  not valid (Ymax < Ymin).\" %(ymax))  \n\n        points = [ float(m.group(i)) for i in range(1, (numPoints+1) ) ]\n        \n        if (imWidth>0 and imHeight>0):\n            validate_point_inside_bounds(xmin,ymin,imWidth,imHeight);\n            validate_point_inside_bounds(xmax,ymax,imWidth,imHeight);\n\n    else:\n        \n        numPoints = 8;\n        \n        if withTranscription and withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*,(.*)$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: x1,y1,x2,y2,x3,y3,x4,y4,confidence,transcription\")\n        elif withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: x1,y1,x2,y2,x3,y3,x4,y4,confidence\")\n        elif withTranscription:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,(.*)$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. 
Should be: x1,y1,x2,y2,x3,y3,x4,y4,transcription\")\n        else:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: x1,y1,x2,y2,x3,y3,x4,y4\")\n            \n        points = [ float(m.group(i)) for i in range(1, (numPoints+1) ) ]\n        \n        validate_clockwise_points(points)\n        \n        if (imWidth>0 and imHeight>0):\n            validate_point_inside_bounds(points[0],points[1],imWidth,imHeight);\n            validate_point_inside_bounds(points[2],points[3],imWidth,imHeight);\n            validate_point_inside_bounds(points[4],points[5],imWidth,imHeight);\n            validate_point_inside_bounds(points[6],points[7],imWidth,imHeight);\n            \n    \n    if withConfidence:\n        try:\n            confidence = float(m.group(numPoints+1+1))\n        except ValueError:\n            raise Exception(\"Confidence value must be a float\")       \n            \n    if withTranscription:\n        posTranscription = numPoints + 1 + (2 if withConfidence else 1)\n        transcription = m.group(posTranscription)\n        m2 = re.match(r'^\\s*\\\"(.*)\\\"\\s*$',transcription)\n        if m2 != None : #Transcription with double quotes, we extract the value and replace escaped characters\n            transcription = m2.group(1).replace(\"\\\\\\\\\", \"\\\\\").replace(\"\\\\\\\"\", \"\\\"\")\n    \n    return points,confidence,transcription, angle\n    \n            \ndef validate_point_inside_bounds(x,y,imWidth,imHeight):\n    if(x<0 or x>imWidth):\n            raise Exception(\"X value (%s) not valid. Image dimensions: (%s,%s)\" %(x,imWidth,imHeight))\n    if(y<0 or y>imHeight):\n            raise Exception(\"Y value (%s)  not valid. 
Image dimensions: (%s,%s)\" %(y,imWidth,imHeight))\n\ndef validate_clockwise_points(points):\n    \"\"\"\n    Validates that the 4 points that delimit a polygon are in clockwise order.\n    \"\"\"\n    \n    if len(points) != 8:\n        raise Exception(\"Points list not valid.\" + str(len(points)))\n    \n    point = [\n                [int(points[0]) , int(points[1])],\n                [int(points[2]) , int(points[3])],\n                [int(points[4]) , int(points[5])],\n                [int(points[6]) , int(points[7])]\n            ]\n    edge = [\n                ( point[1][0] - point[0][0])*( point[1][1] + point[0][1]),\n                ( point[2][0] - point[1][0])*( point[2][1] + point[1][1]),\n                ( point[3][0] - point[2][0])*( point[3][1] + point[2][1]),\n                ( point[0][0] - point[3][0])*( point[0][1] + point[3][1])\n    ]\n    \n    summatory = edge[0] + edge[1] + edge[2] + edge[3]\n    if summatory>0:\n        raise Exception(\"Points are not clockwise. The coordinates of bounding quadrilaterals have to be given in clockwise order. Regarding the correct interpretation of 'clockwise' remember that the image coordinate system used is the standard one, with the image origin at the upper left, the X axis extending to the right and Y axis extending downwards.\")\n\ndef get_tl_line_values_from_file_contents(content,CRLF=True,LTRB=True,withTranscription=False,withConfidence=False,imWidth=0,imHeight=0,sort_by_confidences=True):\n    \"\"\"\n    Returns all points, confidences and transcriptions of a file in lists. 
Valid line formats:\n    xmin,ymin,xmax,ymax,[confidence],[transcription]\n    x1,y1,x2,y2,x3,y3,x4,y4,[confidence],[transcription]\n    \"\"\"\n    pointsList = []\n    transcriptionsList = []\n    confidencesList = []\n    angleList = []\n    \n    lines = content.split( \"\\r\\n\" if CRLF else \"\\n\" )\n    for line in lines:\n        line = line.replace(\"\\r\",\"\").replace(\"\\n\",\"\")\n        if(line != \"\") :\n            points, confidence, transcription, angle = get_tl_line_values(line,LTRB,withTranscription,withConfidence,imWidth,imHeight)\n            pointsList.append(points)\n            transcriptionsList.append(transcription)\n            angleList.append(angle)\n            confidencesList.append(confidence)\n\n    if withConfidence and len(confidencesList)>0 and sort_by_confidences:\n        import numpy as np\n        sorted_ind = np.argsort(-np.array(confidencesList))\n        confidencesList = [confidencesList[i] for i in sorted_ind]\n        pointsList = [pointsList[i] for i in sorted_ind]\n        transcriptionsList = [transcriptionsList[i] for i in sorted_ind]        \n        \n    return pointsList,confidencesList,transcriptionsList,angleList\n\ndef main_evaluation(p,default_evaluation_params_fn,validate_data_fn,evaluate_method_fn,show_result=True,per_sample=True):\n    \"\"\"\n    This process validates a method, evaluates it and, if it succeeds, generates a ZIP file with a JSON entry for each sample.\n    Params:\n    p: Dictionary of parameters with the GT/submission locations. 
If None is passed, the parameters sent by the system are used.\n    default_evaluation_params_fn: points to a function that returns a dictionary with the default parameters used for the evaluation\n    validate_data_fn: points to a method that validates the correct format of the submission\n    evaluate_method_fn: points to a function that evaluates the submission and returns a Dictionary with the results\n    \"\"\"\n    \n    if (p == None):\n        p = dict([s[1:].split('=') for s in sys.argv[1:]])\n        if(len(sys.argv)<3):\n            print_help()\n    evalParams = default_evaluation_params_fn()\n    if 'p' in p.keys():\n        evalParams.update( p['p'] if isinstance(p['p'], dict) else json.loads(p['p'][1:-1]) )\n    resDict={'calculated':True,'Message':'','method':'{}','per_sample':'{}'}    \n    try:\n        validate_data_fn(p['g'], p['s'], evalParams)  \n        evalData = evaluate_method_fn(p['g'], p['s'], evalParams)\n        resDict.update(evalData)\n        \n    except Exception as e:\n        resDict['Message']= str(e)\n        resDict['calculated']=False\n\n    if 'o' in p:\n        if not os.path.exists(p['o']):\n            os.makedirs(p['o'])\n\n        resultsOutputname = p['o'] + '/results.zip'\n        outZip = zipfile.ZipFile(resultsOutputname, mode='w', allowZip64=True)\n\n        del resDict['per_sample']\n        if 'output_items' in resDict.keys():\n            del resDict['output_items']\n\n        outZip.writestr('method.json',json.dumps(resDict))\n        \n    if not resDict['calculated']:\n        if show_result:\n            sys.stderr.write('Error!\\n'+ resDict['Message']+'\\n\\n')\n        if 'o' in p:\n            outZip.close()\n        return resDict\n    \n    if 'o' in p:\n        if per_sample == True:\n            # iterate the dicts with .items(); bare dict iteration yields only keys\n            for k,v in evalData['per_sample'].items():\n                outZip.writestr( k + '.json',json.dumps(v)) \n\n            if 'output_items' in evalData.keys():\n                for k, v in evalData['output_items'].items():\n    
                outZip.writestr( k,v) \n\n        outZip.close()\n\n    if show_result:\n        sys.stdout.write(\"Calculated:\\n\")\n        sys.stdout.write(json.dumps(resDict['method']))\n        print(\"\\n\")\n    \n    return resDict\n\n\ndef main_validation(default_evaluation_params_fn,validate_data_fn):\n    \"\"\"\n    This process validates a method\n    Params:\n    default_evaluation_params_fn: points to a function that returns a dictionary with the default parameters used for the evaluation\n    validate_data_fn: points to a method that validates the correct format of the submission\n    \"\"\"    \n    try:\n        p = dict([s[1:].split('=') for s in sys.argv[1:]])\n        evalParams = default_evaluation_params_fn()\n        if 'p' in p.keys():\n            evalParams.update( p['p'] if isinstance(p['p'], dict) else json.loads(p['p'][1:-1]) )\n\n        validate_data_fn(p['g'], p['s'], evalParams)              \n        print('SUCCESS')\n        sys.exit(0)\n    except Exception as e:\n        print(str(e))\n        sys.exit(101)\n"
  },
  {
    "path": "dataset/TD500/Evaluation_Protocol/script.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom collections import namedtuple\nimport rrc_evaluation_funcs\nimport importlib\nimport numpy as np\nimport Polygon as plg\nfrom tqdm import tqdm\n\neval_result_dir = \"../../output/Analysis/output_eval\"\nfid_path = '{}/Eval_TD500.txt'.format(eval_result_dir)\n\ndef evaluation_imports():\n    \"\"\"\n    evaluation_imports: Dictionary ( key = module name , value = alias  )  with python modules used in the evaluation. \n    \"\"\"    \n    return {\n            'Polygon':'plg',\n            'numpy':'np'\n            }\n\ndef default_evaluation_params():\n    \"\"\"\n    default_evaluation_params: Default parameters to use for the validation and evaluation.\n    \"\"\"\n    return {\n                'IOU_CONSTRAINT' :0.5,\n                'AREA_PRECISION_CONSTRAINT' :0.5,\n                'GT_SAMPLE_NAME_2_ID':'gt_img_([0-9]+).txt',\n                'DET_SAMPLE_NAME_2_ID':'res_img_([0-9]+).txt',\n                'LTRB':True, #LTRB:2points(left,top,right,bottom) or 4 points(x1,y1,x2,y2,x3,y3,x4,y4)\n                'CRLF':False, # Lines are delimited by Windows CRLF format\n                'CONFIDENCES':False, #Detections must include confidence value. 
AP will be calculated\n                'PER_SAMPLE_RESULTS':True #Generate per sample results and produce data for visualization\n            }\n\ndef validate_data(gtFilePath, submFilePath,evaluationParams):\n    \"\"\"\n    Method validate_data: validates that all files in the results folder are correct (have the correct name contents).\n                            Validates also that there are no missing files in the folder.\n                            If some error detected, the method raises the error\n    \"\"\"\n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n    \n    #Validate format of GroundTruth\n    for k in gt:\n        rrc_evaluation_funcs.validate_lines_in_file(k,gt[k],evaluationParams['CRLF'],evaluationParams['LTRB'],True)\n\n    #Validate format of results\n    for k in subm:\n        if (k in gt) == False :\n            raise Exception(\"The sample %s not present in GT\" %k)\n        \n        rrc_evaluation_funcs.validate_lines_in_file(k,subm[k],evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n\n    \ndef evaluate_method(gtFilePath, submFilePath, evaluationParams):\n    \"\"\"\n    Method evaluate_method: evaluate method and returns the results\n        Results. Dictionary with the following values:\n        - method (required)  Global method metrics. Ex: { 'Precision':0.8,'Recall':0.9 }\n        - samples (optional) Per sample metrics. 
Ex: {'sample1' : { 'Precision':0.8,'Recall':0.9 } , 'sample2' : { 'Precision':0.8,'Recall':0.9 }\n    \"\"\"    \n    \n    for module,alias in evaluation_imports().items():\n        globals()[alias] = importlib.import_module(module)    \n    \n    def polygon_from_points(points):\n        \"\"\"\n        Returns a Polygon object to use with the Polygon2 class from a list of 8 points: x1,y1,x2,y2,x3,y3,x4,y4\n        \"\"\"        \n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(points[0])\n        resBoxes[0,4]=int(points[1])\n        resBoxes[0,1]=int(points[2])\n        resBoxes[0,5]=int(points[3])\n        resBoxes[0,2]=int(points[4])\n        resBoxes[0,6]=int(points[5])\n        resBoxes[0,3]=int(points[6])\n        resBoxes[0,7]=int(points[7])\n        pointMat = resBoxes[0].reshape([2,4]).T\n        return plg.Polygon( pointMat)    \n    \n    def rectangle_to_polygon(rect):\n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(rect.xmin)\n        resBoxes[0,4]=int(rect.ymax)\n        resBoxes[0,1]=int(rect.xmin)\n        resBoxes[0,5]=int(rect.ymin)\n        resBoxes[0,2]=int(rect.xmax)\n        resBoxes[0,6]=int(rect.ymin)\n        resBoxes[0,3]=int(rect.xmax)\n        resBoxes[0,7]=int(rect.ymax)\n\n        pointMat = resBoxes[0].reshape([2,4]).T\n        \n        return plg.Polygon( pointMat)\n    \n    def rectangle_to_points(rect):\n        points = [int(rect.xmin), int(rect.ymax), int(rect.xmax), int(rect.ymax), int(rect.xmax), int(rect.ymin), int(rect.xmin), int(rect.ymin)]\n        return points\n        \n    def get_union(pD,pG):\n        areaA = pD.area();\n        areaB = pG.area();\n        return areaA + areaB - get_intersection(pD, pG);\n        \n    def get_intersection_over_union(pD,pG):\n        try:\n            return get_intersection(pD, pG) / get_union(pD, pG);\n        except:\n            return 0\n        \n    def get_intersection(pD,pG):\n        pInt = pD & pG\n        if 
len(pInt) == 0:\n            return 0\n        return pInt.area()\n    \n    def compute_ap(confList, matchList,numGtCare):\n        correct = 0\n        AP = 0\n        if len(confList)>0:\n            confList = np.array(confList)\n            matchList = np.array(matchList)\n            sorted_ind = np.argsort(-confList)\n            confList = confList[sorted_ind]\n            matchList = matchList[sorted_ind]\n            for n in range(len(confList)):\n                match = matchList[n]\n                if match:\n                    correct += 1\n                    AP += float(correct)/(n + 1)\n\n            if numGtCare>0:\n                AP /= numGtCare\n            \n        return AP\n    \n    perSampleMetrics = {}\n    \n    matchedSum = 0\n    \n    Rectangle = namedtuple('Rectangle', 'xmin ymin xmax ymax')\n    \n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n   \n    numGlobalCareGt = 0;\n    numGlobalCareDet = 0;\n    \n    arrGlobalConfidences = [];\n    arrGlobalMatches = [];\n\n    with open(fid_path, 'w') as f:\n        for resFile in tqdm(gt):\n\n            gtFile = rrc_evaluation_funcs.decode_utf8(gt[resFile])\n            recall = 0\n            precision = 0\n            hmean = 0\n\n            detMatched = 0\n\n            iouMat = np.empty([1,1])\n\n            gtPols = []\n            detPols = []\n\n            gtPolPoints = []\n            detPolPoints = []\n\n            gtAngle = []\n            detAngle = []\n\n            #Array of Ground Truth Polygons' keys marked as don't Care\n            gtDontCarePolsNum = []\n            #Array of Detected Polygons' matched with a don't Care GT\n            detDontCarePolsNum = []\n\n            pairs = []\n            detMatchedNums = []\n\n            arrSampleConfidences = [];\n            arrSampleMatch = [];\n            
sampleAP = 0;\n\n            evaluationLog = \"\"\n\n            pointsList,_,transcriptionsList,angleList = rrc_evaluation_funcs.get_tl_line_values_from_file_contents(gtFile,evaluationParams['CRLF'],evaluationParams['LTRB'],True,False)\n            for n in range(len(pointsList)):\n                points = pointsList[n]\n                angle = angleList[n]\n                transcription = transcriptionsList[n]\n                dontCare = transcription == \"###\"\n                if evaluationParams['LTRB']:\n                    gtRect = Rectangle(*points)\n                    gtPol = rectangle_to_polygon(gtRect)\n                else:\n                    gtPol = polygon_from_points(points)\n                gtPols.append(gtPol)\n                gtPolPoints.append(points)\n                gtAngle.append(angle)\n                if dontCare:\n                    gtDontCarePolsNum.append(len(gtPols)-1)\n\n            evaluationLog += \"GT polygons: \" + str(len(gtPols)) + (\" (\" + str(len(gtDontCarePolsNum)) + \" don't care)\\n\" if len(gtDontCarePolsNum)>0 else \"\\n\")\n\n            if resFile in subm:\n\n                detFile = rrc_evaluation_funcs.decode_utf8(subm[resFile])\n\n                pointsList,confidencesList,_,angleList = rrc_evaluation_funcs.get_tl_line_values_from_file_contents(detFile,evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n                for n in range(len(pointsList)):\n                    points = pointsList[n]\n                    angle = angleList[n]\n\n                    if evaluationParams['LTRB']:\n                        detRect = Rectangle(*points)\n                        detPol = rectangle_to_polygon(detRect)\n                    else:\n                        detPol = polygon_from_points(points)\n                    detPols.append(detPol)\n                    detPolPoints.append(points)\n                    detAngle.append(angle)\n                    if len(gtDontCarePolsNum)>0 :\n 
                       for dontCarePol in gtDontCarePolsNum:\n                            dontCarePol = gtPols[dontCarePol]\n                            intersected_area = get_intersection(dontCarePol,detPol)\n                            pdDimensions = detPol.area()\n                            precision = 0 if pdDimensions == 0 else intersected_area / pdDimensions\n                            if (precision > evaluationParams['AREA_PRECISION_CONSTRAINT'] ):\n                                detDontCarePolsNum.append(len(detPols)-1)\n                                break\n\n                evaluationLog += \"DET polygons: \" + str(len(detPols)) + (\" (\" + str(len(detDontCarePolsNum)) + \" don't care)\\n\" if len(detDontCarePolsNum)>0 else \"\\n\")\n\n                if len(gtPols)>0 and len(detPols)>0:\n                    #Calculate IoU and precision matrixs\n                    outputShape=[len(gtPols),len(detPols)]\n                    iouMat = np.empty(outputShape)\n                    gtRectMat = np.zeros(len(gtPols),np.int8)\n                    detRectMat = np.zeros(len(detPols),np.int8)\n                    for gtNum in range(len(gtPols)):\n                        for detNum in range(len(detPols)):\n                            pG = gtPols[gtNum]\n                            pD = detPols[detNum]\n                            iouMat[gtNum,detNum] = get_intersection_over_union(pD,pG)\n\n                    for gtNum in range(len(gtPols)):\n                        for detNum in range(len(detPols)):\n                            if gtRectMat[gtNum] == 0 and detRectMat[detNum] == 0 and gtNum not in gtDontCarePolsNum and detNum not in detDontCarePolsNum :\n                                if iouMat[gtNum,detNum]>evaluationParams['IOU_CONSTRAINT'] and abs(gtAngle[gtNum]-detAngle[detNum]) < np.pi / 8:\n                                    gtRectMat[gtNum] = 1\n                                    detRectMat[detNum] = 1\n                                    detMatched += 
1\n                                    pairs.append({'gt':gtNum,'det':detNum})\n                                    detMatchedNums.append(detNum)\n                                    evaluationLog += \"Match GT #\" + str(gtNum) + \" with Det #\" + str(detNum) + \"\\n\"\n\n                if evaluationParams['CONFIDENCES']:\n                    for detNum in range(len(detPols)):\n                        if detNum not in detDontCarePolsNum :\n                            #we exclude the don't care detections\n                            match = detNum in detMatchedNums\n\n                            arrSampleConfidences.append(confidencesList[detNum])\n                            arrSampleMatch.append(match)\n\n                            arrGlobalConfidences.append(confidencesList[detNum]);\n                            arrGlobalMatches.append(match);\n\n            numGtCare = (len(gtPols) - len(gtDontCarePolsNum))\n            numDetCare = (len(detPols) - len(detDontCarePolsNum))\n            if numGtCare == 0:\n                recall = float(1)\n                precision = float(0) if numDetCare >0 else float(1)\n                sampleAP = precision\n            else:\n                recall = float(detMatched) / numGtCare\n                precision = 0 if numDetCare==0 else float(detMatched) / numDetCare\n                if evaluationParams['CONFIDENCES'] and evaluationParams['PER_SAMPLE_RESULTS']:\n                    sampleAP = compute_ap(arrSampleConfidences, arrSampleMatch, numGtCare )\n\n            hmean = 0 if (precision + recall)==0 else 2.0 * precision * recall / (precision + recall)\n\n            # write result (values must match the label order: Precision first, then Recall)\n            f.write('%s :: Precision=%.4f - Recall=%.4f\\n' % (\"img_\"+str(resFile) + \".txt\", precision, recall))\n\n            matchedSum += detMatched\n            numGlobalCareGt += numGtCare\n            numGlobalCareDet += numDetCare\n\n            if evaluationParams['PER_SAMPLE_RESULTS']:\n                perSampleMetrics[resFile] = 
{\n                                                'precision':precision,\n                                                'recall':recall,\n                                                'hmean':hmean,\n                                                'pairs':pairs,\n                                                'AP':sampleAP,\n                                                'iouMat':[] if len(detPols)>100 else iouMat.tolist(),\n                                                'gtPolPoints':gtPolPoints,\n                                                'detPolPoints':detPolPoints,\n                                                'gtDontCare':gtDontCarePolsNum,\n                                                'detDontCare':detDontCarePolsNum,\n                                                'evaluationParams': evaluationParams,\n                                                'evaluationLog': evaluationLog\n                                            }\n\n        # Compute MAP and MAR\n        AP = 0\n        if evaluationParams['CONFIDENCES']:\n            AP = compute_ap(arrGlobalConfidences, arrGlobalMatches, numGlobalCareGt)\n\n        methodRecall = 0 if numGlobalCareGt == 0 else float(matchedSum)/numGlobalCareGt\n        methodPrecision = 0 if numGlobalCareDet == 0 else float(matchedSum)/numGlobalCareDet\n        methodHmean = 0 if methodRecall + methodPrecision==0 else 2* methodRecall * methodPrecision / (methodRecall + methodPrecision)\n\n        methodMetrics = {'precision':methodPrecision, 'recall':methodRecall,'hmean': methodHmean, 'AP': AP}\n\n        resDict = {'calculated':True,'Message':'','method': methodMetrics,'per_sample': perSampleMetrics}\n\n        f.write('ALL :: AP=%.4f - Precision=%.4f - Recall=%.4f - Fscore=%.4f'\n                % (AP, methodPrecision, methodRecall, methodHmean))\n\n    return resDict;\n\n\nif __name__=='__main__':\n\n    rrc_evaluation_funcs.main_evaluation(None,default_evaluation_params,validate_data,evaluate_method)\n"
  },
  {
    "path": "dataset/TD500/Evaluation_Protocol/script075.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom collections import namedtuple\nimport rrc_evaluation_funcs\nimport importlib\n\ndef evaluation_imports():\n    \"\"\"\n    evaluation_imports: Dictionary ( key = module name , value = alias  )  with python modules used in the evaluation. \n    \"\"\"    \n    return {\n            'Polygon':'plg',\n            'numpy':'np'\n            }\n\ndef default_evaluation_params():\n    \"\"\"\n    default_evaluation_params: Default parameters to use for the validation and evaluation.\n    \"\"\"\n    return {\n                'IOU_CONSTRAINT' :0.75,\n                'AREA_PRECISION_CONSTRAINT' :0.5,\n                'GT_SAMPLE_NAME_2_ID':'gt_img_([0-9]+).txt',\n                'DET_SAMPLE_NAME_2_ID':'res_img_([0-9]+).txt',\n                'LTRB':False, #LTRB:2points(left,top,right,bottom) or 4 points(x1,y1,x2,y2,x3,y3,x4,y4)\n                'CRLF':False, # Lines are delimited by Windows CRLF format\n                'CONFIDENCES':False, #Detections must include confidence value. 
AP will be calculated\n                'PER_SAMPLE_RESULTS':True #Generate per sample results and produce data for visualization\n            }\n\ndef validate_data(gtFilePath, submFilePath,evaluationParams):\n    \"\"\"\n    Method validate_data: validates that all files in the results folder are correct (have the correct name contents).\n                            Validates also that there are no missing files in the folder.\n                            If some error detected, the method raises the error\n    \"\"\"\n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n    \n    #Validate format of GroundTruth\n    for k in gt:\n        rrc_evaluation_funcs.validate_lines_in_file(k,gt[k],evaluationParams['CRLF'],evaluationParams['LTRB'],True)\n\n    #Validate format of results\n    for k in subm:\n        if (k in gt) == False :\n            raise Exception(\"The sample %s not present in GT\" %k)\n        \n        rrc_evaluation_funcs.validate_lines_in_file(k,subm[k],evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n\n    \ndef evaluate_method(gtFilePath, submFilePath, evaluationParams):\n    \"\"\"\n    Method evaluate_method: evaluate method and returns the results\n        Results. Dictionary with the following values:\n        - method (required)  Global method metrics. Ex: { 'Precision':0.8,'Recall':0.9 }\n        - samples (optional) Per sample metrics. 
Ex: {'sample1' : { 'Precision':0.8,'Recall':0.9 } , 'sample2' : { 'Precision':0.8,'Recall':0.9 }\n    \"\"\"    \n    \n    for module,alias in evaluation_imports().items():\n        globals()[alias] = importlib.import_module(module)    \n    \n    def polygon_from_points(points):\n        \"\"\"\n        Returns a Polygon object to use with the Polygon2 class from a list of 8 points: x1,y1,x2,y2,x3,y3,x4,y4\n        \"\"\"        \n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(points[0])\n        resBoxes[0,4]=int(points[1])\n        resBoxes[0,1]=int(points[2])\n        resBoxes[0,5]=int(points[3])\n        resBoxes[0,2]=int(points[4])\n        resBoxes[0,6]=int(points[5])\n        resBoxes[0,3]=int(points[6])\n        resBoxes[0,7]=int(points[7])\n        pointMat = resBoxes[0].reshape([2,4]).T\n        return plg.Polygon( pointMat)    \n    \n    def rectangle_to_polygon(rect):\n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(rect.xmin)\n        resBoxes[0,4]=int(rect.ymax)\n        resBoxes[0,1]=int(rect.xmin)\n        resBoxes[0,5]=int(rect.ymin)\n        resBoxes[0,2]=int(rect.xmax)\n        resBoxes[0,6]=int(rect.ymin)\n        resBoxes[0,3]=int(rect.xmax)\n        resBoxes[0,7]=int(rect.ymax)\n\n        pointMat = resBoxes[0].reshape([2,4]).T\n        \n        return plg.Polygon( pointMat)\n    \n    def rectangle_to_points(rect):\n        points = [int(rect.xmin), int(rect.ymax), int(rect.xmax), int(rect.ymax), int(rect.xmax), int(rect.ymin), int(rect.xmin), int(rect.ymin)]\n        return points\n        \n    def get_union(pD,pG):\n        areaA = pD.area();\n        areaB = pG.area();\n        return areaA + areaB - get_intersection(pD, pG);\n        \n    def get_intersection_over_union(pD,pG):\n        try:\n            return get_intersection(pD, pG) / get_union(pD, pG);\n        except:\n            return 0\n        \n    def get_intersection(pD,pG):\n        pInt = pD & pG\n        if 
len(pInt) == 0:\n            return 0\n        return pInt.area()\n    \n    def compute_ap(confList, matchList,numGtCare):\n        correct = 0\n        AP = 0\n        if len(confList)>0:\n            confList = np.array(confList)\n            matchList = np.array(matchList)\n            sorted_ind = np.argsort(-confList)\n            confList = confList[sorted_ind]\n            matchList = matchList[sorted_ind]\n            for n in range(len(confList)):\n                match = matchList[n]\n                if match:\n                    correct += 1\n                    AP += float(correct)/(n + 1)\n\n            if numGtCare>0:\n                AP /= numGtCare\n            \n        return AP\n    \n    perSampleMetrics = {}\n    \n    matchedSum = 0\n    \n    Rectangle = namedtuple('Rectangle', 'xmin ymin xmax ymax')\n    \n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n   \n    numGlobalCareGt = 0;\n    numGlobalCareDet = 0;\n    \n    arrGlobalConfidences = [];\n    arrGlobalMatches = [];\n\n    for resFile in gt:\n        \n        gtFile = rrc_evaluation_funcs.decode_utf8(gt[resFile])\n        recall = 0\n        precision = 0\n        hmean = 0    \n        \n        detMatched = 0\n        \n        iouMat = np.empty([1,1])\n        \n        gtPols = []\n        detPols = []\n        \n        gtPolPoints = []\n        detPolPoints = []  \n        \n        #Array of Ground Truth Polygons' keys marked as don't Care\n        gtDontCarePolsNum = []\n        #Array of Detected Polygons' matched with a don't Care GT\n        detDontCarePolsNum = []   \n        \n        pairs = [] \n        detMatchedNums = []\n        \n        arrSampleConfidences = [];\n        arrSampleMatch = [];\n        sampleAP = 0;\n\n        evaluationLog = \"\"\n        \n        pointsList,_,transcriptionsList = 
rrc_evaluation_funcs.get_tl_line_values_from_file_contents(gtFile,evaluationParams['CRLF'],evaluationParams['LTRB'],True,False)\n        for n in range(len(pointsList)):\n            points = pointsList[n]\n            transcription = transcriptionsList[n]\n            dontCare = transcription == \"###\"\n            if evaluationParams['LTRB']:\n                gtRect = Rectangle(*points)\n                gtPol = rectangle_to_polygon(gtRect)\n            else:\n                gtPol = polygon_from_points(points)\n            gtPols.append(gtPol)\n            gtPolPoints.append(points)\n            if dontCare:\n                gtDontCarePolsNum.append( len(gtPols)-1 )\n                \n        evaluationLog += \"GT polygons: \" + str(len(gtPols)) + (\" (\" + str(len(gtDontCarePolsNum)) + \" don't care)\\n\" if len(gtDontCarePolsNum)>0 else \"\\n\")\n        \n        if resFile in subm:\n            \n            detFile = rrc_evaluation_funcs.decode_utf8(subm[resFile]) \n            \n            pointsList,confidencesList,_ = rrc_evaluation_funcs.get_tl_line_values_from_file_contents(detFile,evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n            for n in range(len(pointsList)):\n                points = pointsList[n]\n                \n                if evaluationParams['LTRB']:\n                    detRect = Rectangle(*points)\n                    detPol = rectangle_to_polygon(detRect)\n                else:\n                    detPol = polygon_from_points(points)                    \n                detPols.append(detPol)\n                detPolPoints.append(points)\n                if len(gtDontCarePolsNum)>0 :\n                    for dontCarePol in gtDontCarePolsNum:\n                        dontCarePol = gtPols[dontCarePol]\n                        intersected_area = get_intersection(dontCarePol,detPol)\n                        pdDimensions = detPol.area()\n                        precision = 0 if 
pdDimensions == 0 else intersected_area / pdDimensions\n                        if (precision > evaluationParams['AREA_PRECISION_CONSTRAINT'] ):\n                            detDontCarePolsNum.append( len(detPols)-1 )\n                            break\n                                \n            evaluationLog += \"DET polygons: \" + str(len(detPols)) + (\" (\" + str(len(detDontCarePolsNum)) + \" don't care)\\n\" if len(detDontCarePolsNum)>0 else \"\\n\")\n            \n            if len(gtPols)>0 and len(detPols)>0:\n                #Calculate IoU and precision matrixs\n                outputShape=[len(gtPols),len(detPols)]\n                iouMat = np.empty(outputShape)\n                gtRectMat = np.zeros(len(gtPols),np.int8)\n                detRectMat = np.zeros(len(detPols),np.int8)\n                for gtNum in range(len(gtPols)):\n                    for detNum in range(len(detPols)):\n                        pG = gtPols[gtNum]\n                        pD = detPols[detNum]\n                        iouMat[gtNum,detNum] = get_intersection_over_union(pD,pG)\n\n                for gtNum in range(len(gtPols)):\n                    for detNum in range(len(detPols)):\n                        if gtRectMat[gtNum] == 0 and detRectMat[detNum] == 0 and gtNum not in gtDontCarePolsNum and detNum not in detDontCarePolsNum :\n                            if iouMat[gtNum,detNum]>evaluationParams['IOU_CONSTRAINT']:\n                                gtRectMat[gtNum] = 1\n                                detRectMat[detNum] = 1\n                                detMatched += 1\n                                pairs.append({'gt':gtNum,'det':detNum})\n                                detMatchedNums.append(detNum)\n                                evaluationLog += \"Match GT #\" + str(gtNum) + \" with Det #\" + str(detNum) + \"\\n\"\n\n            if evaluationParams['CONFIDENCES']:\n                for detNum in range(len(detPols)):\n                    if detNum not in 
detDontCarePolsNum :\n                        #we exclude the don't care detections\n                        match = detNum in detMatchedNums\n\n                        arrSampleConfidences.append(confidencesList[detNum])\n                        arrSampleMatch.append(match)\n\n                        arrGlobalConfidences.append(confidencesList[detNum]);\n                        arrGlobalMatches.append(match);\n                            \n        numGtCare = (len(gtPols) - len(gtDontCarePolsNum))\n        numDetCare = (len(detPols) - len(detDontCarePolsNum))\n        if numGtCare == 0:\n            recall = float(1)\n            precision = float(0) if numDetCare >0 else float(1)\n            sampleAP = precision\n        else:\n            recall = float(detMatched) / numGtCare\n            precision = 0 if numDetCare==0 else float(detMatched) / numDetCare\n            if evaluationParams['CONFIDENCES'] and evaluationParams['PER_SAMPLE_RESULTS']:\n                sampleAP = compute_ap(arrSampleConfidences, arrSampleMatch, numGtCare )                    \n\n        hmean = 0 if (precision + recall)==0 else 2.0 * precision * recall / (precision + recall)                \n\n        matchedSum += detMatched\n        numGlobalCareGt += numGtCare\n        numGlobalCareDet += numDetCare\n        \n        if evaluationParams['PER_SAMPLE_RESULTS']:\n            perSampleMetrics[resFile] = {\n                                            'precision':precision,\n                                            'recall':recall,\n                                            'hmean':hmean,\n                                            'pairs':pairs,\n                                            'AP':sampleAP,\n                                            'iouMat':[] if len(detPols)>100 else iouMat.tolist(),\n                                            'gtPolPoints':gtPolPoints,\n                                            'detPolPoints':detPolPoints,\n                                   
         'gtDontCare':gtDontCarePolsNum,\n                                            'detDontCare':detDontCarePolsNum,\n                                            'evaluationParams': evaluationParams,\n                                            'evaluationLog': evaluationLog                                        \n                                        }\n                                    \n    # Compute MAP and MAR\n    AP = 0\n    if evaluationParams['CONFIDENCES']:\n        AP = compute_ap(arrGlobalConfidences, arrGlobalMatches, numGlobalCareGt)\n\n    methodRecall = 0 if numGlobalCareGt == 0 else float(matchedSum)/numGlobalCareGt\n    methodPrecision = 0 if numGlobalCareDet == 0 else float(matchedSum)/numGlobalCareDet\n    methodHmean = 0 if methodRecall + methodPrecision==0 else 2* methodRecall * methodPrecision / (methodRecall + methodPrecision)\n    \n    methodMetrics = {'precision':methodPrecision, 'recall':methodRecall,'hmean': methodHmean, 'AP': AP  }\n\n    resDict = {'calculated':True,'Message':'','method': methodMetrics,'per_sample': perSampleMetrics}\n    \n    \n    return resDict;\n\n\n\nif __name__=='__main__':\n        \n    rrc_evaluation_funcs.main_evaluation(None,default_evaluation_params,validate_data,evaluate_method)\n"
  },
  {
    "path": "dataset/TD500/batch_eval.sh",
    "content": "for ((i=490000;i<=510000;i+=1000)); do echo $i;sh eval.sh $1/submit-${i}_1.5|grep Calc;done\n"
  },
  {
    "path": "dataset/TD500/eval.sh",
    "content": "cd dataset/TD500\nrm submit.zip\ncp $1/*.txt submit\ncd submit/;zip -r  submit.zip * &> ../log.txt ; mv submit.zip ../; cd ../\nrm log.txt\npython Evaluation_Protocol/script.py -g=gt.zip -s=submit.zip\n"
  },
  {
    "path": "dataset/TD500/submit/res_img_1.txt",
    "content": "637,679,749,723,0.08314135922921725\n"
  },
  {
    "path": "dataset/TD500_Text.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n__author__ = '古溪'\nimport re\nimport os\nimport numpy as np\nfrom util import strs\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines\nfrom util.misc import norm2\n\n\nclass TD500Text(TextDataset):\n\n    def __init__(self, data_root, is_training=True, ignore_list=None, load_memory=False, transform=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        self.image_root = os.path.join(data_root, 'Train' if is_training else 'Test')\n        self.annotation_root = os.path.join(data_root,'Train' if is_training else 'Test')\n        self.image_list = os.listdir(self.image_root)\n\n        p = re.compile('.rar|.txt')\n        self.image_list = [x for x in self.image_list if not p.findall(x)]\n        p = re.compile('(.jpg|.JPG|.PNG|.JPEG)')\n        self.annotation_list = ['{}'.format(p.sub(\"\", img_name)) for img_name in self.image_list]\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_txt(gt_path):\n        \"\"\"\n        .mat file parser\n        :param gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = read_lines(gt_path+\".txt\")\n        polygons = []\n        for line in lines:\n            line = strs.remove_all(line.strip('\\ufeff'), '\\xef\\xbb\\xbf')\n            gt = line.split(',')\n            x1, y1, x2, y2, x3, y3, x4, y4 = list(map(int, gt[:8]))\n            xx = [x1, x2, x3, x4]\n            yy = [y1, y2, y3, y4]\n\n            label = gt[-1].strip().replace(\"###\", \"#\")\n            pts = np.stack([xx, yy]).T.astype(np.int32)\n            # d1 = norm2(pts[0] - 
pts[1])\n            # d2 = norm2(pts[1] - pts[2])\n            # d3 = norm2(pts[2] - pts[3])\n            # d4 = norm2(pts[3] - pts[0])\n            # if min([d1, d2, d3, d4]) < 2:\n            #     continue\n            polygons.append(TextInstance(pts, 'c', label))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n        # Read image data\n        image = pil_load_img(image_path)\n        try:\n            # Read annotation\n            annotation_id = self.annotation_list[item]\n            annotation_path = os.path.join(self.annotation_root, annotation_id)\n            polygons = self.parse_txt(annotation_path)\n        except:\n            polygons = None\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n        # image_id = self.image_list[item]\n        # image_path = os.path.join(self.image_root, image_id)\n        #\n        # # Read image data\n        # image = pil_load_img(image_path)\n        #\n        # try:\n        #     # Read annotation\n        #     annotation_id = self.annotation_list[item]\n        #     annotation_path = os.path.join(self.annotation_root, annotation_id)\n        #     polygons = self.parse_txt(annotation_path)\n        # except:\n        #     polygons = None\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                   
   image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    import os\n    import cv2\n    from util.augmentation import Augmentation\n    from util import canvas as cav\n    import time\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset = TD500Text(\n        data_root='../data/TD500',\n        is_training=False,\n        transform=transform\n    )\n\n    # img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, meta = trainset[944]\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask = trainset[idx]\n        img, train_mask, tr_mask = map(lambda x: x.cpu().numpy(), (img, train_mask, tr_mask))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n        print(idx, img.shape)\n\n        for i in range(tr_mask.shape[2]):\n            cv2.imshow(\"tr_mask_{}\".format(i),\n                       cav.heatmap(np.array(tr_mask[:, :, i] * 255 / np.max(tr_mask[:, :, i]), dtype=np.uint8)))\n\n        cv2.imshow('imgs', img)\n        cv2.waitKey(0)\n"
  },
  {
    "path": "dataset/Total_Text.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport warnings\nwarnings.filterwarnings(\"ignore\")\nimport os\nimport re\nimport numpy as np\nimport scipy.io as io\nfrom util import strs\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nimport cv2\nfrom util import io as libio\n\n\nclass TotalText(TextDataset):\n\n    def __init__(self, data_root, ignore_list=None, is_training=True, load_memory=False, transform=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        if ignore_list:\n            with open(ignore_list) as f:\n                ignore_list = f.readlines()\n                ignore_list = [line.strip() for line in ignore_list]\n        else:\n            ignore_list = []\n\n        self.image_root = os.path.join(data_root, 'Images', 'Train' if is_training else 'Test')\n        self.annotation_root = os.path.join(data_root, 'gt', 'Train' if is_training else 'Test')\n        self.image_list = os.listdir(self.image_root)\n        self.image_list = list(filter(lambda img: img.replace('.jpg', '') not in ignore_list, self.image_list))\n        self.annotation_list = ['poly_gt_{}'.format(img_name.replace('.jpg', '')) for img_name in self.image_list]\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_mat(mat_path):\n        \"\"\"\n        .mat file parser\n        :param mat_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        annot = io.loadmat(mat_path + \".mat\")\n        polygons = []\n        for cell in annot['polygt']:\n            x = cell[1][0]\n            y = cell[3][0]\n            text = cell[4][0] if len(cell[4]) > 0 else '#'\n            ori = 
cell[5][0] if len(cell[5]) > 0 else 'c'\n\n            if len(x) < 4:  # too few points\n                continue\n            pts = np.stack([x, y]).T.astype(np.int32)\n            polygons.append(TextInstance(pts, ori, text))\n\n        return polygons\n\n    @staticmethod\n    def parse_carve_txt(gt_path):\n        \"\"\"\n        .txt file parser\n        :param gt_path: (str), txt file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = libio.read_lines(gt_path + \".txt\")\n        polygons = []\n        for line in lines:\n            line = strs.remove_all(line, '\\xef\\xbb\\xbf')\n            gt = line.split(',')\n            xx = gt[0].replace(\"x: \", \"\").replace(\"[[\", \"\").replace(\"]]\", \"\").lstrip().rstrip()\n            yy = gt[1].replace(\"y: \", \"\").replace(\"[[\", \"\").replace(\"]]\", \"\").lstrip().rstrip()\n            try:\n                xx = [int(x) for x in re.split(r\" *\", xx)]\n                yy = [int(y) for y in re.split(r\" *\", yy)]\n            except ValueError:  # r\" *\" can yield empty tokens; fall back to r\" +\"\n                xx = [int(x) for x in re.split(r\" +\", xx)]\n                yy = [int(y) for y in re.split(r\" +\", yy)]\n            if len(xx) < 4 or len(yy) < 4:  # too few points\n                continue\n            text = gt[-1].split('\\'')[1]\n            try:\n                ori = gt[-2].split('\\'')[1]\n            except IndexError:  # no orientation field; default to 'c'\n                ori = 'c'\n            pts = np.stack([xx, yy]).T.astype(np.int32)\n            polygons.append(TextInstance(pts, ori, text))\n        # print(polygon)\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        # Read image data\n        image = pil_load_img(image_path)\n\n        # Read annotation\n        annotation_id = self.annotation_list[item]\n        annotation_path = os.path.join(self.annotation_root, annotation_id)\n        polygons = self.parse_mat(annotation_path)\n\n       
 data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        # image_id = self.image_list[item]\n        # image_path = os.path.join(self.image_root, image_id)\n        #\n        # # Read image data\n        # image = pil_load_img(image_path)\n        #\n        # # Read annotation\n        # annotation_id = self.annotation_list[item]\n        # annotation_path = os.path.join(self.annotation_root, annotation_id)\n        # polygons = self.parse_mat(annotation_path)\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    import time\n    from util.augmentation import Augmentation\n    from util import canvas as cav\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    # transform = BaseTransformNresize(\n    #     mean=means, std=stds\n    # )\n\n    trainset = TotalText(\n        data_root='../data/total-text-mat',\n        ignore_list=None,\n        is_training=True,\n        transform=transform\n    )\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags = 
trainset[idx]\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags\\\n            = map(lambda x: x.cpu().numpy(),\n                  (img, train_mask, tr_mask, distance_field,\n                   direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n\n        distance_map = cav.heatmap(np.array(distance_field * 255 / np.max(distance_field), dtype=np.uint8))\n        cv2.imshow(\"distance_map\", distance_map)\n        cv2.waitKey(0)\n\n        direction_map = cav.heatmap(np.array(direction_field[0] * 255 / np.max(direction_field[0]), dtype=np.uint8))\n        cv2.imshow(\"direction_field\", direction_map)\n        cv2.waitKey(0)\n\n        from util.vis_flux import vis_direction_field\n        vis_direction_field(direction_field)\n\n        weight_map = cav.heatmap(np.array(weight_matrix * 255 / np.max(weight_matrix), dtype=np.uint8))\n        cv2.imshow(\"weight_matrix\", weight_map)\n        # cv2.waitKey(0)\n\n        boundary_point = ctrl_points[np.where(ignore_tags!=0)[0]]\n        for i, bpts in enumerate(boundary_point):\n            cv2.drawContours(img, [bpts.astype(np.int32)], -1, (0, 255, 0), 1)\n            for j,  pp in enumerate(bpts):\n                if j==0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                elif j==1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n\n            ppts = proposal_points[i]\n            cv2.drawContours(img, [ppts.astype(np.int32)], -1, (0, 0, 255), 1)\n            for j,  pp in enumerate(ppts):\n                if j==0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                
elif j==1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n            cv2.imshow('imgs', img)\n            cv2.waitKey(0)\n\n        # from util.misc import split_edge_seqence\n        # from cfglib.config import config as cfg\n        #\n        # ret, labels = cv2.connectedComponents(np.array(distance_field >0.35, dtype=np.uint8), connectivity=4)\n        # for idx in range(1, ret):\n        #     text_mask = labels == idx\n        #     ist_id = int(np.sum(text_mask*tr_mask)/np.sum(text_mask))-1\n        #     contours, _ = cv2.findContours(text_mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n        #     epsilon = 0.007 * cv2.arcLength(contours[0], True)\n        #     approx = cv2.approxPolyDP(contours[0], epsilon, True).reshape((-1, 2))\n        #\n        #     control_points = split_edge_seqence(approx, cfg.num_points)\n        #     control_points = np.array(control_points[:cfg.num_points, :]).astype(np.int32)\n        #\n        #     cv2.drawContours(img, [ctrl_points[ist_id].astype(np.int32)], -1, (0, 255, 0), 1)\n        #     cv2.drawContours(img, [control_points.astype(np.int32)], -1, (0, 0, 255), 1)\n        #     for j,  pp in enumerate(control_points):\n        #         if j == 0:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n        #         elif j == 1:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n        #         else:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 0), -1)\n        #\n        #     cv2.imshow('imgs', img)\n        #     cv2.waitKey(0)\n\n"
  },
  {
    "path": "dataset/Total_Text_New.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport warnings\nwarnings.filterwarnings(\"ignore\")\nimport os\nimport cv2\nimport numpy as np\nimport scipy.io as io\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\n\n\nclass TotalText_New(TextDataset):\n\n    def __init__(self, data_root, ignore_list=None, is_training=True,\n                 load_memory=False, pix_mask=False, transform=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.pix_mask = pix_mask\n        self.load_memory = load_memory\n\n        if ignore_list:\n            with open(ignore_list) as f:\n                ignore_list = f.readlines()\n                ignore_list = [line.strip() for line in ignore_list]\n        else:\n            ignore_list = []\n\n        self.image_root = os.path.join(data_root, 'Images', 'Train' if is_training else 'Test')\n        self.annotation_root = os.path.join(data_root, 'gt/Polygon', 'Train' if is_training else 'Test')\n        self.image_list = os.listdir(self.image_root)\n        self.image_list = list(filter(lambda img: img.replace('.jpg', '') not in ignore_list, self.image_list))\n        self.annotation_list = ['gt_{}'.format(img_name.replace('.jpg', '')) for img_name in self.image_list]\n\n        if self.pix_mask:\n            self.mask_root = os.path.join(data_root, 'gt_pixel', 'Train' if is_training else 'Test')\n            self.mask_list = os.listdir(self.mask_root)\n            self.mask_list = list(filter(lambda img: img.replace('.jpg', '') not in ignore_list, self.mask_list))\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n\n    @staticmethod\n    def parse_mat(mat_path):\n        \"\"\"\n        .mat file parser\n        :param mat_path: (str), mat 
file path\n        :return: (list), TextInstance\n        \"\"\"\n        annot = io.loadmat(mat_path + \".mat\")\n        polygons = []\n        for cell in annot['gt']:\n            x = cell[1][0]\n            y = cell[3][0]\n            text = cell[4][0] if len(cell[4]) > 0 else '#'\n            ori = cell[5][0] if len(cell[5]) > 0 else 'c'\n            pts = np.stack([x, y]).T.astype(np.int32)\n            polygons.append(TextInstance(pts, ori, text))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        # Read image data\n        image = pil_load_img(image_path)\n\n        # Read annotation\n        annotation_id = self.annotation_list[item]\n        annotation_path = os.path.join(self.annotation_root, annotation_id)\n        polygons = self.parse_mat(annotation_path)\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        # image_id = self.image_list[item]\n        # image_path = os.path.join(self.image_root, image_id)\n        #\n        # # Read image data\n        # image = pil_load_img(image_path)\n        #\n        # # Read annotation\n        # annotation_id = self.annotation_list[item]\n        # annotation_path = os.path.join(self.annotation_root, annotation_id)\n        # polygons = self.parse_mat(annotation_path)\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], 
data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    import time\n    from util.augmentation import Augmentation\n    from util import canvas as cav\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset = TotalText_New(\n        data_root='../data/Total-Text',\n        ignore_list=None,\n        is_training=True,\n        transform=transform\n    )\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags = trainset[idx]\n        img, train_mask, tr_mask, distance_field, \\\n        direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags\\\n            = map(lambda x: x.cpu().numpy(),\n                  (img, train_mask, tr_mask, distance_field,\n                   direction_field, weight_matrix, ctrl_points, proposal_points, ignore_tags))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n\n        distance_map = cav.heatmap(np.array(distance_field * 255 / np.max(distance_field), dtype=np.uint8))\n        cv2.imshow(\"distance_map\", distance_map)\n        cv2.waitKey(0)\n\n        direction_map = cav.heatmap(np.array(direction_field[0] * 255 / np.max(direction_field[0]), dtype=np.uint8))\n        cv2.imshow(\"direction_field\", direction_map)\n        cv2.waitKey(0)\n        #\n        from util.vis_flux import vis_direction_field\n        vis_direction_field(direction_field)\n\n        weight_map = cav.heatmap(np.array(weight_matrix * 255 / np.max(weight_matrix), dtype=np.uint8))\n        cv2.imshow(\"weight_matrix\", weight_map)\n        cv2.waitKey(0)\n\n\n        
boundary_point = ctrl_points[np.where(ignore_tags!=0)[0]]\n        for i, bpts in enumerate(boundary_point):\n            cv2.drawContours(img, [bpts.astype(np.int32)], -1, (0, 255, 0), 1)\n            for j,  pp in enumerate(bpts):\n                if j==0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                elif j==1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n\n            ppts = proposal_points[i]\n            cv2.drawContours(img, [ppts.astype(np.int32)], -1, (0, 0, 255), 1)\n            for j,  pp in enumerate(ppts):\n                if j==0:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n                elif j==1:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n                else:\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 0, 255), -1)\n            cv2.imshow('imgs', img)\n            cv2.waitKey(0)\n\n        # from util.misc import split_edge_seqence\n        # from cfglib.config import config as cfg\n        #\n        # ret, labels = cv2.connectedComponents(np.array(distance_field >0.35, dtype=np.uint8), connectivity=4)\n        # for idx in range(1, ret):\n        #     text_mask = labels == idx\n        #     ist_id = int(np.sum(text_mask*tr_mask)/np.sum(text_mask))-1\n        #     contours, _ = cv2.findContours(text_mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n        #     epsilon = 0.007 * cv2.arcLength(contours[0], True)\n        #     approx = cv2.approxPolyDP(contours[0], epsilon, True).reshape((-1, 2))\n        #\n        #     pts_num = approx.shape[0]\n        #     e_index = [(i, (i + 1) % pts_num) for i in range(pts_num)]\n        #     control_points = split_edge_seqence(approx, e_index, cfg.num_points)\n        #     
control_points = np.array(control_points[:cfg.num_points, :]).astype(np.int32)\n        #\n        #     cv2.drawContours(img, [ctrl_points[ist_id].astype(np.int32)], -1, (0, 255, 0), 1)\n        #     cv2.drawContours(img, [control_points.astype(np.int32)], -1, (0, 0, 255), 1)\n        #     for j,  pp in enumerate(control_points):\n        #         if j == 0:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (255, 0, 255), -1)\n        #         elif j == 1:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 255), -1)\n        #         else:\n        #             cv2.circle(img, (int(pp[0]), int(pp[1])), 2, (0, 255, 0), -1)\n        #\n        #     cv2.imshow('imgs', img)\n        #     cv2.waitKey(0)\n\n"
  },
  {
    "path": "dataset/__init__.py",
    "content": "from .dataload import *\nfrom .Total_Text import TotalText\nfrom .synth_text import SynthText\nfrom .ctw1500_text import Ctw1500Text\nfrom .Icdar15_Text import Icdar15Text\nfrom .Icdar17_Text import Mlt2017Text\nfrom .TD500_Text import TD500Text\nfrom .Icdar19ArT_Json import ArtTextJson\nfrom .Icdar19ArT_Text import ArtText\nfrom .Icdar19_Text import Mlt2019Text\nfrom .CTW1500_Text_New import Ctw1500Text_New\nfrom .Total_Text_New import TotalText_New\nfrom .Icdar19LSVT_Json import LsvtTextJson\nfrom .deploy import DeployDataset\n"
  },
  {
    "path": "dataset/ctw1500/Evaluation_Protocol/ctw1500_eval.py",
    "content": "#!/usr/bin/env python2\n# -*- coding: utf-8 -*-\n\nimport os, shutil, sys\nfrom voc_eval_polygon import voc_eval_polygon\nfrom collections import Counter\nimport numpy as np\n\nimport argparse\nparser = argparse.ArgumentParser()\n\n# basic opts\nparser.add_argument('exp_name', type=str, help='Model output directory')\nargs = parser.parse_args()\n\n\ninput_dir = 'output/{}'.format(args.exp_name)\neval_result_dir = \"output/Analysis/output_eval\"\n\nanno_path = 'data/ctw1500/test/test_label_curve.txt'\nimagesetfile = 'data/ctw1500/test/test.txt'\n\noutputstr = \"dataset/ctw1500/Evaluation_sort/detections_text\"\n\n\n# score_thresh_list=[0.2, 0.3, 0.4, 0.5, 0.6, 0.62, 0.65, 0.7, 0.75, 0.8, 0.9]\nscore_thresh_list = [0.5]\nfiles = os.listdir(input_dir)\nfiles.sort()\nfor iscore in score_thresh_list:\n    fpath = outputstr + str(iscore) + '.txt'\n    with open(fpath, \"w\") as f1:\n        for ix, filename in enumerate(files):\n            imagename = filename[:-4]\n            with open(os.path.join(input_dir, filename), \"r\") as f:\n                lines = f.readlines()\n\n            for line in lines:\n                box = line.strip().split(\",\")\n                assert (len(box) % 2 == 0), 'mismatch xy'\n                out_str = \"{} {}\".format(str(int(imagename[:]) - 1001), 0.999)\n                for i in box:\n                    out_str = out_str + ' ' + str(i)\n                f1.writelines(out_str + '\\n')\n    rec, prec, AP, FP, TP, image_ids, num_gt = voc_eval_polygon(fpath, anno_path, imagesetfile, 'text', ovthresh=0.5)\n    fid_path = '{}/Eval_ctw1500_{}.txt'.format(eval_result_dir, iscore)\n    F = lambda x, y: 2 * x * y * 1.0 / (x + y)\n\n    img_dict = dict(Counter(image_ids))\n\n    with open(fid_path, 'w') as f:\n        count = 0\n        for k, v in zip(img_dict.keys(), img_dict.values()):\n            fp = np.sum(FP[count:count+v])\n            tp = np.sum(TP[count:count+v])\n            count += v\n            recall = tp / 
float(num_gt[k])\n            precision = tp / np.maximum(tp + fp, np.finfo(np.float64).eps)\n            f.write('%s :: Precision=%.4f - Recall=%.4f\\n' % (str(int(k)+1001)+\".txt\", precision, recall))\n\n        Recall = rec[-1]\n        Precision = prec[-1]\n        F_score = F(Recall, Precision)\n        f.write('ALL :: AP=%.4f - Precision=%.4f - Recall=%.4f - Fscore=%.4f' % (AP, Precision, Recall, F_score))\n\n    print('AP: {:.4f}, recall: {:.4f}, precision: {:.4f}, '\n          'FM: {:.4f}\\n'.format(AP, Recall, Precision, F_score))\n\n"
  },
  {
    "path": "dataset/ctw1500/Evaluation_Protocol/voc_eval_polygon.py",
    "content": "# --------------------------------------------------------\n# Fast/er R-CNN\n# Licensed under The MIT License [see LICENSE for details]\n# Written by Bharath Hariharan\n# Modified by yl\n# --------------------------------------------------------\n\nimport os\ntry:\n    import cPickle\nexcept ImportError:  # Python 3: cPickle was renamed to _pickle\n    import _pickle as cPickle\nimport numpy as np\nfrom shapely.geometry import Polygon\nfrom tqdm import tqdm\n\ndef parse_rec_txt(filename):\n    with open(filename.strip(), 'r') as f:\n        gts = f.readlines()\n        objects = []\n        for obj in gts:\n            cors = obj.strip().split(',')\n            obj_struct = {}\n            obj_struct['name'] = 'text'\n            obj_struct['difficult'] = 0\n            obj_struct['bbox'] = [int(cors[0]),\n                                  int(cors[1]),\n                                  int(cors[2]),\n                                  int(cors[3])]\n            objects.append(obj_struct)\n    return objects\n\n\ndef curve_parse_rec_txt(filename):\n    with open(filename.strip(), 'r') as f:\n        gts = f.readlines()\n        objects = []\n        for obj in gts:\n            cors = obj.strip().split(',')\n            obj_struct = {}\n            obj_struct['name'] = 'text'\n            obj_struct['difficult'] = 0\n            # 4 bbox values (xmin, ymin, xmax, ymax) followed by 28 polygon point offsets\n            obj_struct['bbox'] = [int(c) for c in cors[:32]]\n            objects.append(obj_struct)\n    return objects\n\n\ndef voc_ap(rec, 
prec, use_07_metric=False):\n    \"\"\" ap = voc_ap(rec, prec, [use_07_metric])\n    Compute VOC AP given precision and recall.\n    If use_07_metric is true, uses the\n    VOC 07 11 point method (default:False).\n    \"\"\"\n    if use_07_metric:\n        # 11 point metric\n        ap = 0.\n        for t in np.arange(0., 1.1, 0.1):\n            if np.sum(rec >= t) == 0:\n                p = 0\n            else:\n                p = np.max(prec[rec >= t])\n            ap = ap + p / 11.\n    else:\n        # correct AP calculation\n        # first append sentinel values at the end\n        mrec = np.concatenate(([0.], rec, [1.]))\n        mpre = np.concatenate(([0.], prec, [0.]))\n\n        # compute the precision envelope\n        for i in range(mpre.size - 1, 0, -1):\n            mpre[i - 1] = np.maximum(mpre[i - 1], mpre[i])\n\n        # to calculate area under PR curve, look for points\n        # where X axis (recall) changes value\n        i = np.where(mrec[1:] != mrec[:-1])[0]\n\n        # and sum (\\Delta recall) * prec\n        ap = np.sum((mrec[i + 1] - mrec[i]) * mpre[i + 1])\n    return ap\n\n\ndef voc_eval_polygon(detpath,\n             annopath,\n             imagesetfile,\n             classname,\n             ovthresh=0.5,\n             use_07_metric=False):\n\n    # first load gt\n    cachefile = 'dataset/ctw1500/annots.pkl'\n    # read list of images\n    with open(imagesetfile, 'r') as f, open(annopath, 'r') as fa:\n        lines = f.readlines()\n        anno_lines = fa.readlines()\n    imagenames = [x.strip() for x in lines]\n    anno_names = [y.strip() for y in anno_lines]\n    assert(len(imagenames) == len(anno_names)), 'each image should correspond to one label file'\n\n    if not os.path.isfile(cachefile):\n        # load annots\n        recs = {}\n        for i, imagename in enumerate(imagenames):\n            print(anno_names[i].strip())\n            recs[imagename] = curve_parse_rec_txt(anno_names[i])\n            if i % 100 == 0:\n         
       print('Reading annotation for {:d}/{:d}'.format(\n                    i + 1, len(imagenames)))\n        # save\n        print('Saving cached annotations to {:s}'.format(cachefile))\n        with open(cachefile, 'wb') as f:\n            cPickle.dump(recs, f)\n    else:\n        # load\n        with open(cachefile, 'rb') as f:\n            recs = cPickle.load(f)\n\n    class_recs = {}\n    npos = 0\n    num_gt = {}\n    for ix, imagename in enumerate(imagenames):\n        R = [obj for obj in recs[imagename] if obj['name'] == classname] # text\n        # assert(R), 'Can not find any object in '+ classname+' class.'\n        if not R: continue\n        bbox = np.array([x['bbox'] for x in R])\n        difficult = np.array([x['difficult'] for x in R]).astype(bool)  # np.bool was removed in NumPy >= 1.24\n        det = [False] * len(R)\n        npos = npos + sum(~difficult)\n        num_gt[str(ix)] = sum(~difficult)\n        class_recs[str(ix)] = {'bbox': bbox, 'difficult': difficult, 'det': det}\n\n    # read dets\n    detfile = detpath.format(classname)\n    with open(detfile, 'r') as f:\n        lines = f.readlines()\n\n    splitlines = [x.strip().split(' ') for x in lines]\n    image_ids = [x[0] for x in splitlines]\n    confidence = np.array([float(x[1]) for x in splitlines])\n    BB = np.array([[float(z) for z in x[2:]] for x in splitlines])\n    # sort by confidence\n    sorted_ind = np.argsort(-confidence)\n    sorted_scores = np.sort(-confidence)\n    # BB = BB[sorted_ind, :]\n    # image_ids = [image_ids[x] for x in sorted_ind]\n\n    # go down dets and mark TPs and FPs\n    nd = len(image_ids)\n    tp = np.zeros(nd)\n    fp = np.zeros(nd)\n    # for d in range(nd):\n    for d in tqdm(range(nd)):\n        R = class_recs[image_ids[d]]\n        bb = BB[d] # mask rcnn\n        det_bbox = bb[:]\n        pts = [(det_bbox[j], det_bbox[j+1]) for j in range(0,len(bb),2)]\n        try:\n            pdet = Polygon(pts)\n        except Exception as e:\n            print(e)\n            continue\n       
 if not pdet.is_valid: \n            # print('{} img predicted polygon has intersecting sides.'.format(d))\n            # print(pts, image_ids[d])\n            continue\n\n        ovmax = -np.inf\n        BBGT = R['bbox'].astype(float)\n        gt_bbox = BBGT[:, :4]\n        info_bbox_gt = BBGT[:, 4:32]\n        ls_pgt = [] \n        overlaps = np.zeros(BBGT.shape[0])\n        for iix in range(BBGT.shape[0]):\n            pts = [(int(gt_bbox[iix, 0]) + info_bbox_gt[iix, j],\n                    int(gt_bbox[iix, 1]) + info_bbox_gt[iix, j+1]) for j in range(0,28,2)]\n            pgt = Polygon(pts)\n            if not pgt.is_valid: \n                print('GT polygon has intersecting sides.')\n                continue\n            try:\n                sec = pdet.intersection(pgt)\n            except Exception as e:\n                print('intersect invalid',e)\n                continue\n            try:\n                assert sec.is_valid, 'polygon has intersection sides.'  # for mask rcnn\n            except Exception as e:\n                print(e)\n                continue\n            inters = sec.area\n            uni = pgt.area + pdet.area - inters\n            if uni <= 0.00001: uni = 0.00001\n            overlaps[iix] = inters*1.0 / uni\n            \n        ovmax = np.max(overlaps)\n        jmax = np.argmax(overlaps)\n\n        if ovmax > ovthresh:\n            if not R['difficult'][jmax]:\n                if not R['det'][jmax]:\n                    tp[d] = 1.\n                    R['det'][jmax] = 1\n                else:\n                    fp[d] = 1.\n        else:\n            fp[d] = 1.\n\n    # compute precision recall\n    fpp = fp\n    tpp = tp\n    fp = np.cumsum(fp)\n    tp = np.cumsum(tp)\n    rec = tp / float(npos)\n    # ground truth\n    prec = tp / np.maximum(tp + fp, np.finfo(np.float64).eps)\n    ap = voc_ap(rec, prec, use_07_metric)\n    return rec, prec, ap, fpp, tpp, image_ids, num_gt\n"
  },
  {
    "path": "dataset/ctw1500/Evaluation_sort/detections_text0.5.txt",
    "content": "0 0.999 44 73 49 83 59 80 68 75 78 71 87 69 98 70 106 74 115 78 123 82 131 79 130 71 120 66 111 63 101 59 91 56 80 57 70 61 61 65 53 69\n0 0.999 99 117 92 117 84 117 78 117 71 117 64 117 57 117 50 118 50 124 50 130 50 136 56 136 64 136 71 136 77 136 84 137 90 137 99 137 98 129 98 125\n0 0.999 40 136 39 143 39 151 44 155 50 155 57 155 63 155 70 155 76 156 82 156 89 156 89 148 89 140 86 137 79 137 72 137 64 136 57 136 51 136 46 136\n0 0.999 40 183 42 192 55 193 70 193 84 193 98 194 110 195 124 195 137 197 141 188 143 178 139 168 125 167 111 167 99 166 86 166 72 165 59 165 47 164 40 168\n1 0.999 52 58 59 91 67 124 76 152 124 149 137 148 165 146 194 147 219 149 248 151 277 155 283 132 286 108 257 101 226 95 199 91 166 93 138 95 110 87 82 68\n1 0.999 277 204 267 203 258 204 249 204 242 204 228 206 217 206 207 206 197 206 194 214 196 224 204 226 214 225 225 225 234 224 244 224 253 224 264 224 274 224 277 215\n1 0.999 167 258 157 259 148 259 139 260 134 260 121 261 111 261 101 262 92 264 93 273 94 281 103 281 112 280 122 280 131 280 140 279 148 279 158 279 167 276 167 268\n2 0.999 57 102 63 116 72 119 85 114 99 111 112 110 125 110 138 112 151 115 163 120 173 116 175 103 163 97 151 93 137 90 123 88 109 88 95 90 82 93 69 97\n2 0.999 154 132 141 133 120 133 101 133 87 134 75 135 72 149 71 165 72 181 72 195 73 208 84 208 103 208 120 207 136 206 153 206 153 193 154 177 155 163 155 149\n2 0.999 78 228 78 238 79 247 91 248 100 248 111 248 120 248 130 248 139 248 149 248 159 248 159 237 156 227 148 227 137 227 126 227 117 227 106 227 97 227 87 227\n3 0.999 153 77 159 98 171 107 193 103 216 100 238 99 261 100 283 102 304 106 325 112 340 109 347 88 329 78 305 70 282 67 261 64 240 63 217 64 195 65 172 70\n3 0.999 238 121 237 124 237 128 238 130 242 132 244 132 248 132 251 133 254 133 256 133 260 130 260 127 260 122 258 121 255 120 253 120 249 120 246 120 243 120 239 121\n3 0.999 102 156 101 166 102 179 107 183 118 183 129 184 140 184 151 185 161 185 171 185 180 186 
181 174 182 159 175 157 164 156 154 156 143 156 132 155 122 155 110 155\n3 0.999 301 163 301 176 301 189 311 190 324 191 336 191 348 191 359 192 371 192 382 192 393 193 393 178 392 165 382 164 370 164 358 163 347 163 335 163 323 162 310 161\n3 0.999 165 221 179 227 194 232 213 236 230 238 248 240 266 239 284 237 300 232 315 228 322 213 309 212 293 216 276 220 258 222 241 223 224 221 207 217 190 212 172 206\n4 0.999 19 169 63 179 109 185 160 192 210 199 266 207 328 216 380 225 431 231 481 239 489 185 445 166 396 152 346 144 299 137 248 128 191 116 137 104 82 92 33 99\n4 0.999 279 226 289 246 310 249 332 250 353 253 375 256 399 258 420 260 442 263 465 266 487 266 475 248 453 244 431 241 410 239 388 236 365 233 343 231 322 228 299 226\n4 0.999 76 338 105 345 135 339 167 338 198 345 229 358 265 363 298 364 329 365 360 364 391 359 368 337 334 334 301 332 268 332 234 332 203 326 171 320 138 320 105 327\n5 0.999 54 58 67 64 82 62 91 57 104 52 119 51 134 54 148 59 160 70 172 67 185 58 176 48 163 40 148 33 134 30 119 28 104 29 89 32 76 38 66 45\n5 0.999 23 110 37 114 56 114 78 114 99 114 123 114 146 114 167 113 189 113 210 113 210 94 196 82 172 81 154 82 131 81 110 81 88 82 67 81 45 81 25 81\n5 0.999 84 122 84 125 84 128 84 131 85 132 88 133 90 133 92 132 95 132 98 132 100 131 100 128 100 125 100 121 98 121 95 121 93 121 90 121 87 121 85 121\n5 0.999 100 122 99 127 99 135 100 139 104 141 110 141 116 141 122 141 127 141 133 141 137 141 138 134 137 125 137 121 132 121 126 121 121 121 115 121 109 121 103 121\n5 0.999 47 141 57 153 71 163 85 171 99 174 115 177 131 176 146 172 160 165 171 157 182 144 170 131 155 131 143 141 128 149 114 151 99 148 85 139 73 126 60 130\n5 0.999 95 206 95 217 100 223 108 223 117 223 125 223 134 223 143 223 151 223 160 224 168 223 168 209 164 206 155 206 146 205 138 206 129 206 121 205 112 205 103 205\n6 0.999 118 42 109 41 101 43 93 46 85 49 77 52 68 56 61 61 54 66 48 70 51 77 59 81 66 75 73 70 81 66 88 62 96 58 104 56 112 53 119 49\n6 0.999 168 46 
[Raw detection dump truncated. Each line describes one detected text instance as 42 whitespace-separated fields: an image/instance index, a confidence score, and 20 boundary control points as (x, y) pixel pairs. Representative line:]
6 0.999 18 121 17 150 34 161 64 161 93 161 124 161 153 162 183 163 212 164 242 165 271 165 270 138 254 124 224 124 193 123 164 122 134 121 105 121 73 120 44 120
163 130 164 152 186 155 209 155 234 156 259 157 280 157 304 157 325 157 347 157 371 157 368 136 346 133 323 132 301 131 277 130 254 130 231 129 207 128 185 128\n44 0.999 161 168 161 193 185 195 209 194 233 194 260 194 282 194 306 193 328 194 351 194 375 193 374 168 352 166 328 166 303 166 280 166 256 166 231 166 208 166 183 166\n44 0.999 409 202 377 202 349 203 317 203 286 204 257 204 226 205 195 206 163 206 133 207 123 226 151 230 182 230 213 229 243 228 273 227 305 226 335 225 366 225 396 223\n44 0.999 421 237 412 237 406 237 396 237 388 237 382 238 373 239 364 239 354 239 353 247 353 254 360 256 368 255 378 254 386 254 394 254 402 253 410 253 418 254 421 246\n44 0.999 193 249 182 249 177 249 167 250 157 250 150 250 140 251 131 251 127 258 128 268 129 277 138 278 146 277 156 277 165 277 173 275 183 275 191 274 193 266 193 257\n44 0.999 418 255 412 253 406 253 398 253 391 253 385 253 378 254 370 254 362 254 356 256 356 261 361 266 367 264 375 264 382 264 389 264 396 263 403 263 409 263 417 261\n45 0.999 64 53 76 63 85 52 95 41 107 34 121 31 136 33 149 38 160 47 169 58 181 60 176 47 166 36 154 27 141 21 126 18 111 19 96 24 85 33 74 43\n45 0.999 64 65 63 67 62 71 61 74 61 77 60 81 59 84 59 88 59 92 63 93 66 94 69 90 69 87 69 83 70 80 70 77 72 74 73 69 72 65 68 65\n45 0.999 9 74 9 76 10 79 13 79 16 79 19 79 22 79 25 79 28 79 31 80 35 78 35 75 33 73 30 73 27 73 24 73 21 73 18 73 15 73 12 73\n45 0.999 31 80 28 79 25 79 23 79 20 79 16 79 13 79 11 79 8 80 8 83 8 85 11 86 13 86 16 86 20 86 22 86 25 86 28 86 31 86 32 82\n45 0.999 67 135 70 146 80 148 93 146 106 144 119 144 132 144 145 145 157 146 169 148 180 148 182 135 169 133 156 132 144 130 131 128 117 128 104 128 91 130 79 132\n45 0.999 10 146 9 154 9 162 9 170 9 178 10 186 9 193 9 201 9 210 9 217 17 219 18 210 18 203 18 194 18 187 18 179 18 171 19 163 19 154 19 147\n45 0.999 95 158 99 160 106 162 112 163 118 164 126 165 132 165 138 164 145 162 151 160 152 154 147 151 142 153 135 155 128 155 122 156 115 155 109 153 103 
152 98 152\n45 0.999 166 181 163 178 158 178 153 178 148 178 144 178 139 178 135 178 130 178 126 179 125 182 128 185 133 185 137 185 142 185 147 185 151 185 155 185 160 185 164 185\n45 0.999 124 189 126 190 130 191 135 191 140 191 145 191 150 191 154 191 159 191 163 191 166 186 164 185 159 185 155 184 150 185 145 185 140 185 136 185 131 184 126 184\n45 0.999 125 192 125 197 127 198 132 198 137 198 142 198 146 198 151 198 156 198 160 198 165 198 165 193 161 191 156 191 151 191 147 191 142 191 138 191 133 191 129 191\n45 0.999 125 199 125 203 129 205 133 204 138 204 142 204 147 204 151 204 156 204 160 205 164 203 164 199 159 198 155 198 151 198 146 198 142 198 137 198 133 198 128 198\n45 0.999 134 205 134 207 135 210 137 210 140 210 142 210 145 210 148 210 151 210 154 210 156 208 156 205 154 204 152 204 149 204 147 204 144 204 141 204 139 204 136 204\n46 0.999 184 79 189 91 203 91 216 91 230 91 243 91 257 92 271 92 285 92 298 93 312 92 306 81 293 80 280 80 266 80 252 80 238 79 225 79 211 79 197 79\n46 0.999 181 100 181 116 189 125 207 125 223 126 239 126 255 126 270 126 286 127 301 127 314 126 314 109 306 101 289 101 272 101 256 100 241 100 225 99 211 99 194 99\n46 0.999 251 135 247 137 243 138 239 140 234 142 230 143 226 145 224 147 226 152 226 156 227 160 230 161 234 159 239 157 243 155 247 154 252 151 254 148 253 144 253 140\n46 0.999 275 141 267 143 259 147 252 151 244 156 236 159 228 162 220 165 215 170 216 178 216 186 223 186 230 182 238 177 246 173 254 169 261 165 269 161 275 158 275 149\n46 0.999 127 168 128 175 137 176 146 176 156 176 164 176 174 176 182 177 192 178 201 179 206 173 204 165 196 164 186 163 177 163 168 162 159 162 150 162 141 162 131 162\n46 0.999 272 161 267 163 262 167 256 170 251 172 246 175 240 177 235 181 234 185 234 191 235 197 241 196 246 194 252 191 257 187 263 184 268 181 273 178 274 172 273 167\n46 0.999 286 173 289 178 299 179 308 179 317 180 326 180 335 180 344 181 354 181 363 182 367 172 363 168 354 168 345 167 336 167 327 167 318 
166 309 166 300 165 289 165\n46 0.999 208 210 213 216 221 221 229 224 239 227 247 227 256 227 266 224 273 221 282 217 286 210 280 202 272 205 264 209 254 212 245 213 236 212 229 209 221 205 213 203\n46 0.999 176 214 187 225 201 236 215 244 230 248 247 249 264 248 278 244 292 236 304 225 314 215 302 205 290 213 275 223 259 229 243 230 229 227 215 221 201 209 188 203\n46 0.999 0 232 0 237 0 244 0 250 2 254 8 254 14 254 20 254 27 254 31 254 36 253 37 247 38 241 39 236 35 232 28 231 22 231 17 231 11 231 5 230\n46 0.999 0 262 0 266 3 267 8 267 14 267 18 267 23 266 28 267 33 266 35 263 36 258 34 254 30 254 25 253 20 253 16 253 11 253 6 253 2 253 0 257\n46 0.999 167 259 175 271 192 271 209 271 226 271 242 271 258 271 275 272 292 272 308 272 324 271 316 260 300 260 283 260 266 260 250 260 233 259 217 259 200 259 184 259\n46 0.999 34 266 28 266 21 266 16 266 9 266 3 267 0 270 0 276 0 282 0 287 0 292 5 293 10 293 16 292 23 292 30 292 34 289 34 283 34 278 34 272\n47 0.999 415 107 393 85 368 60 341 41 307 35 267 36 237 45 212 52 172 62 140 82 140 117 156 129 179 113 210 96 251 89 286 90 318 96 351 114 383 136 403 132\n47 0.999 230 155 229 165 230 174 240 174 249 174 259 174 268 175 283 175 292 176 302 176 306 175 307 166 302 157 294 154 285 152 276 150 265 150 255 150 245 152 236 153\n47 0.999 196 182 196 200 203 209 220 209 237 208 255 208 271 208 293 209 310 209 324 210 338 210 338 193 328 183 312 182 296 183 279 182 262 182 245 182 228 182 211 182\n47 0.999 230 217 228 228 235 235 241 239 250 243 259 246 272 247 285 243 293 239 299 235 302 231 303 224 298 215 290 214 282 214 272 214 263 214 252 213 244 213 234 214\n47 0.999 322 272 319 259 310 256 297 262 285 266 271 267 258 267 246 264 233 260 223 255 212 259 210 270 223 277 235 281 247 285 260 287 273 288 286 286 299 283 310 279\n47 0.999 156 288 173 305 195 322 220 332 247 339 273 341 298 336 322 328 346 315 368 301 376 283 360 263 338 269 316 284 289 294 260 296 234 293 209 282 190 267 170 268\n47 0.999 396 340 396 351 
407 352 418 352 429 352 441 352 452 353 467 353 478 353 490 353 498 351 496 339 484 339 473 339 462 339 451 339 440 339 428 339 417 339 405 339\n48 0.999 50 31 52 49 68 53 85 56 100 59 117 63 133 68 149 74 164 81 178 90 192 93 192 78 177 68 163 59 149 51 133 45 117 40 100 36 84 32 67 30\n48 0.999 31 58 31 80 41 100 61 106 82 113 103 120 126 126 147 133 169 140 190 146 210 152 211 131 203 115 180 106 159 100 139 93 118 86 97 79 75 71 52 63\n48 0.999 76 139 76 154 83 163 97 166 109 169 122 173 135 177 149 180 162 184 175 188 189 191 189 173 182 165 169 162 157 158 145 154 131 151 118 147 105 143 90 139\n48 0.999 76 165 76 179 84 188 97 191 109 194 121 197 134 200 148 204 161 208 173 211 186 215 188 200 179 191 167 187 155 183 142 180 129 176 116 173 103 168 89 165\n49 0.999 218 162 224 171 236 178 245 172 253 169 264 168 275 169 286 174 295 180 302 186 313 175 311 168 302 159 293 153 282 148 270 146 259 145 248 147 236 151 227 156\n49 0.999 221 224 228 226 239 227 250 230 260 232 272 232 284 232 295 232 305 232 312 230 310 215 303 211 292 212 283 212 272 212 262 211 252 210 240 208 228 206 223 211\n49 0.999 215 256 221 263 230 270 241 276 253 280 265 283 277 282 289 280 300 276 310 271 312 262 306 250 295 253 285 257 274 258 263 258 252 256 240 251 231 243 222 246\n50 0.999 359 67 358 71 358 74 358 77 360 79 362 80 365 82 368 83 370 83 374 84 377 83 378 80 378 76 376 74 375 73 371 72 368 70 366 69 364 67 361 67\n50 0.999 350 76 351 84 352 92 357 97 364 99 371 101 379 104 386 107 393 110 401 113 408 116 409 109 406 101 400 97 393 94 386 91 379 89 372 86 365 82 357 78\n50 0.999 388 110 387 113 387 116 386 120 387 123 390 124 393 125 396 126 399 127 402 128 406 129 409 125 409 122 408 117 406 117 403 116 399 114 396 113 394 112 390 111\n50 0.999 213 151 221 162 227 171 237 165 248 162 261 163 272 168 281 176 289 185 296 193 304 189 305 178 298 168 290 159 281 152 271 145 260 141 248 139 236 140 224 144\n50 0.999 255 199 254 202 254 205 255 209 258 210 261 211 265 212 
267 212 271 213 274 214 277 213 278 210 278 207 276 203 273 203 270 202 267 202 264 200 261 199 257 199\n50 0.999 248 210 248 214 248 218 251 220 256 222 260 223 265 224 269 225 274 226 278 226 283 225 284 221 283 216 278 215 274 214 269 213 265 213 261 211 257 210 253 209\n50 0.999 199 216 206 233 212 249 221 264 234 275 250 285 268 290 287 289 303 279 313 264 319 249 316 236 302 234 293 249 278 261 260 257 245 246 234 231 227 215 215 208\n50 0.999 170 237 169 243 168 249 168 253 174 255 179 256 185 257 190 258 196 260 201 260 207 261 210 257 207 251 205 247 201 243 196 241 191 240 185 239 180 237 174 237\n50 0.999 145 254 144 264 147 272 156 273 166 274 175 275 185 277 194 278 203 280 213 281 222 282 222 273 219 264 209 262 200 261 191 259 182 258 172 257 163 255 153 254\n50 0.999 106 303 106 308 107 312 112 313 117 314 121 314 126 314 131 315 135 315 140 316 146 313 146 308 143 305 138 304 132 304 128 303 123 302 119 302 114 302 110 301\n50 0.999 485 309 480 305 475 305 468 306 463 307 457 308 451 309 446 309 441 309 436 311 436 316 439 321 445 319 451 319 457 318 463 318 468 317 474 316 479 316 485 314\n50 0.999 380 311 381 315 381 320 383 325 387 325 393 325 397 325 402 325 406 325 411 326 416 325 417 320 417 314 412 312 408 312 403 312 399 311 394 311 390 310 385 310\n50 0.999 391 341 390 344 390 348 390 351 393 352 396 353 400 354 403 355 406 356 409 356 413 356 415 352 415 348 412 347 409 346 406 345 403 344 400 343 396 342 393 341\n50 0.999 480 359 475 359 471 359 466 360 462 360 457 360 453 360 448 360 443 360 441 362 442 367 445 370 450 370 455 370 459 370 463 370 468 370 473 370 477 369 480 364\n51 0.999 66 28 67 49 70 67 93 59 114 55 135 52 155 54 175 57 194 63 213 72 232 78 231 58 225 41 205 31 186 24 167 19 146 16 125 16 104 19 84 23\n51 0.999 122 58 121 66 122 79 121 90 128 93 138 93 148 94 158 94 166 95 175 96 183 96 184 88 184 73 184 63 178 61 170 61 160 60 150 59 139 58 128 57\n51 0.999 78 112 91 118 107 124 123 128 139 131 157 132 176 132 193 
130 210 127 222 124 223 101 214 95 196 98 178 101 162 103 145 101 128 98 112 94 96 88 82 91\n51 0.999 53 147 53 172 61 185 84 186 108 187 133 188 156 189 180 189 203 190 225 191 247 191 248 165 236 155 214 154 191 153 170 152 145 151 121 150 97 148 73 146\n51 0.999 45 200 50 223 71 223 96 223 119 223 142 223 165 224 188 224 208 224 231 224 255 223 247 204 225 203 203 202 181 200 159 200 135 200 112 199 89 198 65 197\n52 0.999 144 79 131 78 121 81 109 85 98 88 87 91 77 96 67 99 55 104 58 111 63 121 70 124 81 119 91 114 102 111 113 108 124 106 136 105 144 100 143 90\n52 0.999 172 84 170 96 173 105 185 105 198 108 210 111 221 114 233 117 243 121 255 126 261 117 267 107 258 102 248 97 237 93 226 89 214 85 201 82 189 80 179 79\n52 0.999 73 139 82 148 102 148 121 148 143 148 163 148 183 148 202 148 223 148 243 149 248 132 238 123 216 122 197 122 178 122 159 121 139 121 118 121 97 121 79 122\n53 0.999 23 112 52 148 80 130 96 102 126 83 164 79 201 88 236 103 260 103 238 124 270 142 295 119 270 87 254 60 218 35 181 24 140 22 101 34 69 51 47 79\n53 0.999 143 92 127 118 124 137 123 154 124 173 129 193 134 214 143 235 175 240 207 240 222 243 239 240 239 215 228 191 220 177 213 160 208 138 204 114 193 94 166 88\n54 0.999 71 40 83 50 100 40 117 33 135 29 153 27 172 28 190 31 208 37 224 44 237 38 225 28 209 21 191 15 173 12 154 11 136 13 118 16 101 22 85 31\n54 0.999 282 115 263 105 239 105 213 106 187 106 159 106 131 107 103 107 78 108 52 108 35 117 54 130 81 130 108 129 135 129 162 129 189 129 215 129 242 129 265 129\n54 0.999 246 133 225 133 209 134 190 135 169 135 146 135 125 135 103 135 82 136 63 140 63 161 82 162 104 161 125 161 146 161 167 161 188 161 209 161 232 160 246 152\n54 0.999 286 167 259 166 235 165 209 165 182 166 153 166 125 167 97 167 71 168 43 169 32 187 58 189 86 188 113 188 141 188 168 188 195 188 222 188 250 188 277 188\n54 0.999 53 195 55 218 82 218 108 218 131 219 152 219 175 219 199 219 223 219 244 218 262 217 260 196 236 195 213 196 190 195 166 195 143 
195 120 195 96 195 74 195\n54 0.999 251 225 230 225 213 225 195 226 175 226 152 226 130 226 109 226 89 226 69 229 70 250 88 252 110 251 131 251 152 251 172 252 192 252 213 252 235 252 250 245\n54 0.999 98 257 97 272 108 278 127 278 141 278 151 278 165 278 180 278 195 278 208 278 214 277 213 261 202 257 189 257 175 257 161 256 148 256 134 255 119 255 106 255\n55 0.999 203 52 191 45 175 45 157 46 140 46 121 46 102 46 85 47 68 47 49 48 41 57 54 65 72 64 90 64 108 63 125 63 142 63 160 63 178 62 195 62\n55 0.999 195 77 182 71 167 72 150 72 133 73 116 73 98 73 82 74 66 74 47 75 41 85 55 91 71 90 88 91 104 90 121 90 136 90 154 90 170 90 188 89\n55 0.999 201 107 187 107 171 108 151 108 131 108 110 108 87 108 66 109 43 108 40 127 40 143 58 146 79 146 100 146 120 146 141 146 160 145 181 145 200 145 202 126\n55 0.999 203 147 189 146 172 147 150 147 131 147 109 147 84 147 62 147 40 147 38 165 37 185 55 186 76 186 97 186 117 186 139 186 159 186 181 186 200 186 202 166\n55 0.999 29 253 38 271 61 261 80 256 99 254 119 253 137 254 158 257 178 263 192 270 207 267 203 253 185 244 166 237 145 233 125 231 103 233 83 235 64 239 47 246\n55 0.999 57 277 68 278 84 278 98 278 112 279 126 279 139 280 153 279 168 280 181 280 183 266 172 265 159 265 145 265 131 264 117 264 102 264 88 264 74 264 60 264\n56 0.999 470 0 424 1 380 12 338 29 296 46 251 63 208 80 165 100 128 115 88 135 79 171 123 159 166 145 211 129 255 115 299 100 341 86 386 72 428 59 463 33\n56 0.999 474 110 432 120 389 130 346 140 304 150 263 159 220 169 178 179 134 189 93 200 80 222 125 215 167 206 210 198 253 189 296 180 339 171 381 162 418 154 470 145\n56 0.999 189 278 196 287 209 281 222 275 236 271 250 272 264 274 279 279 292 286 303 290 311 276 302 266 290 259 276 253 262 251 247 250 232 251 217 256 205 262 193 270\n56 0.999 110 272 109 276 109 280 108 285 108 289 112 290 117 290 121 290 125 291 129 291 134 290 135 286 136 281 135 277 133 274 129 273 125 273 120 273 116 272 114 272\n56 0.999 487 271 477 271 467 271 457 271 
448 272 439 272 429 272 419 272 411 273 409 281 407 290 417 291 427 291 437 291 446 291 456 291 465 291 475 291 485 290 487 282\n56 0.999 394 301 397 309 408 309 421 309 433 310 445 310 456 311 469 311 481 311 495 312 498 296 493 292 481 291 469 291 457 291 445 291 433 291 421 291 409 291 397 290\n56 0.999 202 305 203 315 211 316 222 315 231 316 242 317 251 318 260 319 273 321 279 313 281 306 278 300 268 299 258 298 248 298 239 296 230 295 220 293 209 293 203 298\n56 0.999 105 336 105 338 105 342 108 343 111 343 113 343 116 345 120 345 122 345 125 345 127 342 127 339 125 335 123 335 120 334 117 333 114 333 111 333 109 333 106 333\n56 0.999 462 339 448 339 433 341 418 343 405 344 391 343 375 342 361 341 354 344 355 358 356 372 370 373 384 373 398 373 412 374 427 374 442 373 456 373 462 365 462 353\n56 0.999 95 350 94 354 94 357 93 360 94 365 97 365 101 365 105 366 108 366 111 367 115 365 117 361 117 358 117 352 114 352 111 352 108 350 103 350 100 350 98 350\n57 0.999 68 137 98 138 132 136 169 134 209 133 242 134 271 144 316 156 353 168 375 150 394 111 383 89 339 82 303 80 268 79 233 74 191 74 152 76 114 80 81 88\n57 0.999 96 179 95 212 107 228 131 232 155 236 179 240 200 244 228 248 252 252 275 258 298 253 300 220 287 206 265 202 243 199 219 196 194 192 168 188 142 183 118 178\n58 0.999 12 35 23 37 43 37 62 37 81 38 101 39 122 39 142 39 162 40 181 40 183 19 169 17 152 17 132 16 113 15 93 15 75 15 55 15 36 15 16 15\n58 0.999 14 60 25 61 39 62 54 62 66 63 81 63 95 63 108 64 123 64 135 63 137 49 125 47 111 46 97 46 83 46 70 46 56 45 43 45 30 45 17 45\n58 0.999 12 80 22 82 36 82 49 83 61 83 75 83 88 83 101 83 115 84 126 82 129 68 118 67 104 66 91 66 78 66 66 66 53 66 41 65 28 65 15 65\n58 0.999 21 162 32 163 45 163 57 163 70 163 83 164 97 163 111 163 123 163 135 163 140 153 128 153 116 153 104 153 91 153 78 153 66 153 53 153 40 153 26 153\n58 0.999 141 169 130 163 118 163 105 163 93 163 81 163 69 163 57 163 44 163 32 163 21 165 27 173 41 173 53 173 66 172 81 172 93 172 
106 172 118 172 130 172\n59 0.999 292 80 283 82 273 83 264 85 254 87 244 89 235 90 226 92 217 94 207 97 208 106 218 106 227 104 237 102 246 100 256 98 266 97 275 94 283 93 293 90\n59 0.999 268 107 263 101 257 101 251 102 245 103 238 105 233 106 228 106 222 107 214 109 213 113 215 118 221 118 228 116 235 115 241 114 247 113 253 112 258 111 265 110\n59 0.999 190 103 179 105 169 107 160 110 150 114 141 118 131 122 121 124 117 130 115 140 114 151 124 150 132 147 142 144 151 140 161 135 173 131 182 127 188 124 189 114\n59 0.999 76 115 76 118 77 121 80 121 83 120 86 120 89 120 95 120 96 117 97 115 97 111 97 111 95 108 93 108 89 108 86 108 83 109 80 109 76 112 77 113\n59 0.999 278 118 272 110 265 111 258 112 250 112 243 114 236 115 230 116 223 117 215 118 212 122 216 128 223 128 231 127 238 126 245 124 253 123 260 123 266 121 273 120\n59 0.999 289 128 281 121 272 122 264 123 256 124 248 125 240 126 232 127 225 128 215 129 212 134 217 140 225 139 234 138 242 137 251 136 259 135 267 134 274 133 283 131\n59 0.999 62 120 60 127 56 138 56 143 62 144 72 144 80 146 91 151 99 154 103 156 106 156 108 148 110 141 110 133 105 130 97 126 90 123 82 121 74 121 67 121\n59 0.999 50 126 46 127 43 127 39 127 35 128 31 129 28 130 25 130 21 131 20 134 20 138 23 139 26 138 30 138 34 137 38 136 42 136 45 135 48 135 50 130\n59 0.999 286 133 278 135 270 135 261 136 253 138 244 139 236 139 229 140 220 141 213 142 214 151 222 152 230 151 239 150 248 149 256 148 265 147 273 146 280 145 287 142\n59 0.999 37 139 33 139 30 139 26 140 23 141 20 141 16 142 13 142 10 143 8 146 8 148 11 150 14 150 18 149 21 148 25 148 28 147 31 147 34 147 37 143\n59 0.999 31 148 28 147 25 148 22 148 19 149 16 150 13 150 10 150 7 150 5 153 5 156 8 157 10 157 14 157 17 156 20 156 24 155 26 155 30 154 32 151\n59 0.999 278 158 275 148 267 149 260 149 252 150 244 151 237 151 230 152 223 152 214 153 212 157 214 165 222 164 230 163 238 163 245 162 253 161 260 161 266 160 275 160\n59 0.999 172 160 169 156 166 156 162 156 159 156 
155 156 152 157 149 157 146 157 142 157 141 160 142 164 145 164 150 164 153 163 156 163 160 163 163 163 166 162 170 162\n59 0.999 22 157 20 156 17 157 15 157 12 157 9 157 6 158 4 158 1 159 1 162 1 164 2 166 5 166 8 165 11 165 13 165 16 164 19 164 21 164 22 160\n59 0.999 182 161 176 160 171 161 166 162 160 162 154 163 148 163 143 163 137 164 132 165 131 171 137 172 143 172 149 171 155 171 160 170 166 169 171 169 177 168 182 167\n59 0.999 18 165 15 165 13 165 10 166 8 166 5 166 3 166 1 166 0 168 0 169 0 172 0 174 3 174 6 173 8 173 11 173 14 172 16 172 18 172 18 168\n59 0.999 66 166 63 166 61 166 59 166 56 167 54 167 51 167 49 167 46 168 46 171 45 173 48 174 50 174 53 174 55 173 58 173 61 172 63 172 65 172 67 169\n59 0.999 79 170 75 170 71 170 68 171 64 172 60 172 56 173 53 173 48 174 45 174 44 177 47 179 51 179 55 179 59 178 63 178 67 178 71 177 75 177 78 175\n59 0.999 28 173 24 172 21 173 17 173 13 174 9 174 6 174 2 174 0 176 0 177 0 180 0 182 3 183 7 183 11 182 15 182 19 181 22 181 25 181 28 177\n59 0.999 114 183 110 183 107 183 104 183 100 183 96 183 92 183 89 183 85 184 83 187 83 191 86 192 90 192 94 192 98 192 102 192 106 192 109 192 113 192 114 187\n60 0.999 307 47 292 48 272 52 255 58 238 63 222 68 206 72 190 77 171 84 170 95 177 120 190 121 209 115 227 110 246 105 262 99 279 94 294 90 308 87 314 75\n60 0.999 293 97 279 101 259 110 241 118 223 125 205 130 188 136 171 142 150 151 151 165 157 191 170 195 191 186 209 178 228 171 244 163 261 155 276 147 290 139 300 128\n60 0.999 351 261 343 244 331 237 320 242 307 247 294 251 282 253 269 254 257 254 244 257 242 268 243 284 250 292 262 291 278 288 291 285 304 282 316 277 328 272 340 265\n61 0.999 264 62 265 82 266 105 286 107 311 109 334 110 359 113 385 116 413 120 422 122 439 122 442 99 438 81 418 79 394 75 371 72 348 68 325 65 304 63 284 62\n61 0.999 253 62 229 65 205 65 182 66 156 67 131 69 109 70 85 72 60 74 52 92 56 120 75 124 98 122 124 119 150 117 177 114 202 112 226 110 249 107 253 88\n61 0.999 370 158 363 
150 352 150 341 150 329 150 317 151 307 151 296 151 285 151 272 152 272 161 277 170 288 170 301 171 312 171 324 171 335 171 347 171 360 171 369 167\n61 0.999 119 152 119 162 121 171 132 171 141 171 151 171 160 171 172 171 183 171 190 171 199 170 200 159 195 151 186 151 177 151 167 151 157 151 146 151 136 151 128 151\n61 0.999 64 182 66 198 80 201 98 201 115 201 132 201 150 201 167 200 186 200 200 200 215 200 214 183 198 181 182 181 165 180 148 180 131 180 114 180 96 180 80 181\n61 0.999 428 181 413 180 400 180 386 180 372 181 358 181 345 181 332 181 318 181 305 185 305 200 318 201 332 201 346 201 361 201 375 201 389 201 403 201 418 201 427 195\n62 0.999 46 61 45 86 59 93 79 83 98 76 118 74 138 74 157 79 177 87 196 101 211 107 208 90 203 66 182 56 163 49 144 45 123 43 102 44 81 48 62 54\n62 0.999 113 78 112 81 112 86 113 90 116 91 120 91 123 91 126 91 130 91 133 91 136 90 136 85 136 81 136 77 133 77 130 76 126 76 122 76 118 76 115 77\n62 0.999 47 113 55 121 71 122 89 122 106 123 124 124 141 124 157 125 175 125 192 126 198 113 192 100 174 100 157 98 140 98 123 97 106 96 88 96 70 95 53 96\n62 0.999 31 155 46 168 65 175 84 179 106 182 130 183 152 182 176 178 195 172 211 165 210 134 202 125 177 135 155 141 136 143 115 143 93 140 73 133 52 123 37 127\n62 0.999 16 240 33 246 55 246 78 247 103 248 129 248 153 248 178 248 202 248 222 245 222 215 212 204 184 205 158 204 136 203 112 203 87 202 62 201 36 200 19 208\n62 0.999 19 258 17 287 19 309 45 310 70 310 96 311 123 311 149 311 174 311 196 311 224 308 222 283 218 261 190 260 167 259 142 259 116 259 90 258 64 258 40 257\n63 0.999 413 30 387 37 361 45 334 52 310 59 283 67 258 75 231 82 203 89 188 102 189 123 216 118 243 111 269 104 296 97 322 89 347 83 375 75 403 69 416 54\n63 0.999 359 117 343 120 328 121 311 124 295 127 278 129 261 131 244 134 227 136 227 149 228 166 241 167 258 164 274 161 291 158 308 154 323 151 340 147 355 144 361 133\n63 0.999 444 130 411 135 379 141 345 148 314 156 283 163 251 169 218 175 181 181 160 
191 154 222 185 215 218 208 250 202 281 195 314 189 347 183 380 178 413 171 441 158\n63 0.999 34 166 35 176 37 186 48 186 58 188 67 191 77 196 85 202 93 209 100 217 108 219 111 208 106 200 100 193 92 185 84 178 75 174 65 170 54 168 44 166\n63 0.999 404 183 377 185 350 188 323 192 297 195 269 199 244 203 216 209 192 213 180 229 180 251 203 250 230 246 257 243 285 240 313 238 338 235 365 230 391 226 400 207\n63 0.999 42 193 39 201 37 208 42 213 48 217 55 220 63 224 70 229 78 232 85 236 93 239 96 231 96 222 91 218 84 214 77 210 70 206 63 203 55 199 49 195\n63 0.999 31 210 27 218 31 225 37 232 43 237 50 242 58 247 66 252 75 256 82 259 91 255 94 247 91 240 82 236 74 233 65 230 58 225 51 220 44 214 37 209\n64 0.999 89 47 95 93 115 110 158 108 199 109 239 111 279 116 316 123 353 132 389 142 420 144 431 101 399 80 360 70 322 62 284 53 244 48 201 45 157 42 122 43\n64 0.999 121 194 129 203 153 204 185 206 215 209 247 212 276 214 302 216 329 219 341 201 346 181 342 159 303 155 275 152 248 149 222 145 194 144 162 142 128 139 125 165\n64 0.999 182 282 189 286 208 287 228 288 247 289 267 290 286 291 303 292 322 294 337 290 340 267 338 253 312 250 294 248 275 246 257 245 237 245 216 243 193 243 186 257\n64 0.999 133 309 149 312 173 312 196 314 217 316 240 317 264 318 285 319 307 322 328 323 335 304 318 295 295 293 272 292 250 291 228 290 206 288 183 287 160 285 137 284\n64 0.999 130 318 133 342 157 343 180 344 203 346 225 347 250 348 271 349 293 351 315 352 339 354 334 328 311 326 289 325 266 323 244 322 221 320 196 319 174 319 151 318\n64 0.999 154 384 153 403 170 407 187 407 203 408 221 409 238 410 254 410 271 412 288 412 305 412 305 389 289 388 272 387 256 387 239 386 221 386 204 385 187 384 171 383\n64 0.999 201 410 202 420 203 427 208 430 215 430 222 430 229 430 236 430 242 431 250 431 256 430 257 419 255 410 250 409 242 409 235 409 228 409 221 409 213 409 206 409\n64 0.999 160 430 161 451 179 451 198 451 215 452 234 453 253 454 271 455 287 455 303 455 322 455 321 433 303 
432 285 431 267 430 250 430 232 430 213 429 195 429 178 429\n65 0.999 398 48 373 32 349 22 314 16 284 15 250 21 233 25 210 32 181 43 162 80 161 105 186 96 209 88 240 80 262 78 295 78 339 85 366 94 396 99 395 71\n65 0.999 395 141 364 131 338 142 316 164 292 177 263 180 236 175 216 158 200 135 173 137 158 151 178 175 198 195 220 208 249 218 280 222 309 216 333 201 354 183 372 165\n65 0.999 16 298 15 304 20 307 26 306 32 306 38 307 48 307 55 307 61 307 63 306 65 301 64 294 58 292 54 291 46 291 39 291 35 291 28 291 23 291 18 292\n66 0.999 136 31 129 32 119 35 111 38 102 41 93 45 85 47 77 52 75 57 74 72 73 81 80 79 91 75 102 73 112 70 120 68 128 65 135 62 139 55 138 46\n66 0.999 144 68 134 70 124 74 115 77 105 80 96 83 87 86 77 88 70 91 70 103 70 117 79 116 89 113 100 110 110 106 119 102 128 99 136 97 144 94 144 84\n66 0.999 139 115 131 116 121 118 112 120 102 122 93 123 85 124 75 127 74 133 75 145 74 161 77 161 90 158 101 155 111 153 119 151 127 148 135 147 139 141 139 130\n66 0.999 177 171 166 154 153 158 142 166 128 174 114 180 100 183 86 184 72 183 57 178 48 186 50 199 65 205 81 208 98 208 112 205 127 201 140 195 153 189 165 180\n67 0.999 422 84 385 84 347 86 308 90 269 98 231 106 195 115 159 125 123 138 88 155 89 185 120 188 156 173 188 161 227 150 261 142 301 135 339 132 376 132 412 127\n67 0.999 259 147 251 146 243 147 236 150 226 152 219 156 215 162 216 169 216 179 219 187 226 193 235 192 243 190 250 188 259 186 266 183 271 181 269 174 266 166 265 157\n67 0.999 393 164 366 166 331 175 295 182 260 189 228 195 197 202 165 209 128 216 105 226 106 268 142 267 176 260 209 255 242 250 275 246 310 240 342 234 375 229 396 213\n68 0.999 409 168 402 126 375 106 329 107 287 108 246 108 209 109 179 109 140 110 110 112 83 122 80 167 116 151 153 147 192 144 227 143 268 144 304 146 339 150 370 157\n68 0.999 97 164 126 163 152 161 180 159 208 157 236 157 267 158 295 158 322 161 352 164 373 156 346 151 320 147 293 144 264 143 233 143 205 143 178 145 148 148 118 151\n68 0.999 87 
233 92 235 97 235 104 235 110 235 117 235 126 235 132 235 139 235 147 235 149 228 146 225 139 225 132 224 125 224 118 224 110 224 104 224 96 224 89 225\n68 0.999 64 251 70 252 75 252 84 252 92 252 100 252 110 252 118 252 128 253 131 244 132 235 128 235 120 235 112 235 104 235 94 235 86 234 78 234 68 234 64 243\n68 0.999 132 241 133 246 134 249 139 249 144 249 150 249 157 249 161 249 167 249 172 246 172 241 173 240 170 235 166 235 161 235 155 235 150 235 144 235 138 235 133 237\n68 0.999 296 238 298 253 310 253 326 253 341 254 353 255 369 255 382 255 395 255 414 256 421 255 419 240 406 239 392 239 379 238 363 238 350 237 336 237 321 237 304 237\n68 0.999 298 270 301 272 303 272 308 272 313 272 317 272 323 272 328 272 333 272 337 267 337 261 335 261 331 261 327 260 321 260 317 260 312 260 307 260 302 260 298 264\n68 0.999 381 263 381 270 381 273 386 273 391 273 395 273 401 273 404 273 409 273 415 273 416 269 417 266 414 263 410 262 405 262 401 262 396 261 393 261 388 261 381 262\n68 0.999 430 330 436 331 442 331 450 331 458 332 467 332 476 332 484 332 494 332 499 325 498 315 497 314 490 313 481 313 473 313 464 313 455 313 446 313 437 313 431 322\n69 0.999 11 19 10 43 10 65 33 58 57 48 80 43 104 41 128 41 144 44 166 50 189 58 195 42 191 21 169 19 148 19 123 19 100 18 77 18 54 18 32 18\n69 0.999 31 153 45 155 58 157 75 158 91 159 109 160 126 159 145 157 162 156 179 155 184 145 174 140 158 138 139 136 122 135 104 134 88 135 72 136 56 137 40 139\n69 0.999 79 195 80 200 87 200 93 200 100 200 106 200 111 200 119 200 123 200 131 200 132 193 130 190 123 190 117 190 111 190 106 190 99 190 93 190 87 190 80 190\n69 0.999 74 203 75 211 80 214 88 214 95 214 102 214 109 214 118 214 123 214 130 214 137 213 137 204 131 202 124 201 117 201 110 201 103 201 95 202 88 202 81 202\n69 0.999 31 209 30 213 32 215 35 215 38 215 41 215 44 215 48 215 50 215 53 215 56 213 56 210 54 208 51 208 48 208 45 208 42 208 39 208 36 208 33 208\n69 0.999 28 218 29 221 32 221 36 221 39 221 43 221 46 221 50 
*(Raw TD500 detection output, one line per predicted text instance: an image index, a confidence score, and the x/y coordinates of the 20 predicted boundary control points.)*
260 223 255 223 250 222 245 221 240 220 235 219 230 218 225 218\n108 0.999 140 243 152 247 165 247 178 248 191 249 204 249 217 249 231 250 244 250 257 250 269 246 256 244 243 242 230 242 217 241 204 241 191 240 178 240 164 239 151 238\n109 0.999 139 99 136 111 135 124 134 137 133 151 132 164 132 176 131 190 130 203 129 215 140 222 146 211 146 198 146 185 147 171 147 157 148 145 148 131 150 117 151 104\n109 0.999 360 185 355 184 351 185 345 185 341 186 336 186 331 187 328 189 323 190 323 196 322 200 328 202 333 200 338 198 341 197 346 196 350 196 355 196 360 196 361 190\n109 0.999 319 198 314 196 310 189 303 189 296 190 289 191 286 200 290 205 296 205 305 204 304 204 300 204 292 206 286 203 283 212 291 215 298 215 305 213 311 210 317 207\n109 0.999 408 215 408 221 407 224 408 227 411 229 415 229 418 230 421 230 425 231 428 231 433 228 433 225 433 221 430 217 426 217 423 217 420 216 418 216 414 215 410 215\n109 0.999 468 224 461 222 452 221 444 221 438 227 433 227 425 230 418 229 410 228 401 228 402 236 409 241 416 242 425 242 433 240 439 235 446 234 454 234 461 234 469 233\n109 0.999 116 231 120 234 128 234 134 234 141 234 148 235 155 235 161 235 168 236 176 237 178 227 173 227 167 227 160 225 152 225 146 225 138 225 132 225 125 223 118 223\n109 0.999 101 251 100 255 103 257 105 257 108 257 110 257 113 257 116 257 119 258 121 258 125 253 125 251 122 249 119 249 116 249 113 249 111 249 108 249 105 249 103 249\n110 0.999 81 143 108 159 134 142 164 128 196 120 227 117 259 118 293 124 325 136 352 152 375 140 355 120 329 104 295 94 263 89 228 87 197 90 166 97 135 106 106 122\n110 0.999 444 127 443 131 442 133 442 137 442 139 442 143 444 144 448 145 451 146 454 146 456 146 459 144 460 140 460 137 460 134 459 132 456 130 453 129 450 129 447 127\n110 0.999 433 144 432 150 433 156 436 160 442 161 447 163 454 163 459 165 465 166 471 168 476 168 478 162 478 156 474 151 468 150 462 149 456 147 450 146 444 145 438 144\n110 0.999 435 164 434 169 434 175 439 178 445 179 450 181 
455 182 460 184 466 185 472 187 477 185 478 179 477 174 472 172 467 170 461 169 456 168 450 167 445 165 439 164\n110 0.999 387 171 347 170 310 171 272 172 234 173 196 174 158 175 120 177 82 178 55 186 58 224 90 230 127 227 165 224 206 222 240 219 276 217 314 215 354 213 381 202\n110 0.999 435 182 434 188 434 193 439 196 446 198 451 199 456 201 463 202 469 203 475 206 480 203 481 198 480 191 475 189 469 188 463 187 457 185 451 185 445 183 439 182\n110 0.999 437 197 436 201 436 205 436 209 441 212 445 212 449 214 453 216 457 217 462 218 467 218 468 214 468 209 467 205 463 203 458 202 454 201 450 199 445 198 441 197\n110 0.999 434 211 433 216 432 220 433 225 439 227 444 229 450 231 454 233 459 234 465 237 470 238 472 232 472 226 469 222 464 219 459 218 454 216 449 215 444 213 438 211\n110 0.999 356 224 332 218 302 218 274 219 246 220 218 222 190 223 161 225 134 227 105 231 102 254 126 259 154 257 181 255 209 253 238 251 266 250 295 250 324 248 352 247\n111 0.999 161 85 176 109 187 126 210 119 235 110 261 109 287 112 312 119 335 130 355 142 370 143 378 120 357 107 336 95 312 85 287 76 261 71 235 71 209 73 184 79\n111 0.999 242 125 249 128 258 128 267 130 276 131 286 133 296 134 305 135 315 136 324 138 328 132 322 125 313 124 303 122 294 120 284 119 274 117 266 116 255 115 246 114\n111 0.999 220 127 220 142 227 150 242 152 258 153 273 154 287 156 302 157 317 158 331 160 345 161 345 147 337 139 323 138 308 136 293 135 278 133 264 131 249 129 235 127\n111 0.999 247 167 247 180 248 190 258 192 270 192 281 193 292 193 303 194 314 194 324 194 336 195 335 184 334 174 325 172 313 171 302 171 290 170 279 169 268 168 258 167\n111 0.999 67 224 74 241 81 255 98 247 115 244 132 243 149 242 167 244 184 249 198 256 210 248 217 234 205 224 188 219 171 215 154 213 135 213 119 214 100 216 83 220\n111 0.999 329 226 320 227 312 227 303 228 293 229 284 230 275 230 273 237 274 247 274 254 276 259 285 260 294 259 302 258 311 258 320 258 329 257 331 250 330 241 330 233\n111 0.999 93 249 94 
258 104 259 115 259 125 259 135 259 147 259 156 258 166 258 176 258 185 257 183 248 173 248 163 247 153 246 143 246 133 246 122 246 113 245 102 246\n111 0.999 212 257 194 257 177 258 158 258 140 258 122 259 105 259 85 259 68 259 63 270 62 288 79 289 98 288 114 287 133 286 150 285 168 285 188 285 203 285 211 274\n111 0.999 208 288 190 289 175 289 159 290 143 290 127 291 110 291 93 292 77 293 64 298 59 316 74 318 89 316 105 315 121 314 138 313 158 313 178 312 192 311 206 304\n112 0.999 24 44 24 62 38 67 57 67 74 67 91 67 109 67 128 67 145 67 161 68 177 66 177 49 164 44 146 43 128 43 111 43 93 42 75 42 57 42 40 43\n112 0.999 153 70 135 70 116 69 97 69 78 69 59 69 44 68 24 70 26 93 26 109 26 129 31 144 50 136 70 128 84 120 105 116 122 113 140 111 151 101 151 86\n112 0.999 176 108 164 109 149 109 136 111 121 113 109 115 97 117 84 123 85 135 84 147 87 155 99 156 114 155 131 155 142 155 155 155 168 155 178 149 177 136 176 122\n112 0.999 34 162 33 180 32 199 49 200 66 200 85 201 102 201 122 201 139 202 155 202 170 201 170 183 170 167 155 164 136 163 119 164 100 164 82 163 67 163 49 163\n112 0.999 34 202 33 219 36 233 54 233 71 233 88 233 104 233 123 233 139 233 154 234 169 234 169 216 165 204 148 204 132 204 115 204 98 203 81 203 65 202 48 202\n112 0.999 30 254 44 255 58 255 73 256 88 256 104 256 119 256 135 257 151 257 166 257 173 245 161 241 146 241 131 240 116 240 100 240 84 239 68 239 52 239 35 238\n113 0.999 435 13 436 17 441 19 445 17 449 16 454 15 459 15 463 16 468 17 471 19 476 16 475 11 470 9 466 8 461 7 456 7 452 7 447 7 443 8 439 10\n113 0.999 440 22 440 26 443 30 446 28 450 27 453 26 456 26 461 27 464 28 468 29 472 27 473 22 470 20 466 19 462 18 458 18 454 18 449 18 446 19 442 20\n113 0.999 25 29 26 39 27 47 31 51 40 52 49 52 58 53 66 53 75 53 84 53 93 53 94 46 92 38 86 34 78 34 69 33 60 32 52 31 43 30 35 29\n113 0.999 463 46 462 45 460 44 457 45 455 45 452 45 449 45 448 46 448 48 448 50 448 52 449 55 452 56 454 56 456 56 459 56 461 56 463 52 464 50 464 48\n113 
0.999 490 58 485 55 481 61 475 68 467 71 459 74 449 73 441 69 434 63 428 57 421 57 424 64 429 71 436 76 445 80 454 82 463 81 472 79 480 74 487 67\n113 0.999 43 55 44 60 44 65 49 67 54 67 59 67 65 67 70 67 75 67 80 67 85 65 85 60 82 55 78 55 72 56 67 56 63 55 59 55 53 54 48 54\n113 0.999 110 52 110 65 111 76 125 79 140 82 153 84 166 87 179 91 192 94 206 97 218 101 219 86 214 77 201 74 188 71 175 68 162 64 149 61 136 57 123 54\n113 0.999 435 57 435 62 440 67 444 70 449 71 454 72 460 72 466 70 470 68 474 65 478 61 476 57 471 57 467 60 462 62 457 63 452 63 446 61 442 58 438 55\n113 0.999 46 67 46 73 46 80 46 86 47 92 54 93 62 93 68 93 75 93 82 93 89 92 89 85 88 77 88 71 86 67 79 67 72 67 65 67 59 66 52 66\n113 0.999 133 85 137 88 143 90 149 92 155 93 162 94 169 96 175 97 182 99 188 101 192 96 186 93 181 91 174 90 168 88 162 86 155 85 149 83 143 82 136 81\n113 0.999 232 86 232 88 232 91 234 92 236 93 238 93 241 94 243 94 245 95 248 95 251 94 251 91 250 88 248 87 246 87 244 86 241 86 239 86 237 85 234 84\n113 0.999 220 90 220 94 223 98 226 98 231 99 235 100 239 101 244 102 248 103 252 104 257 105 258 100 254 97 251 95 246 95 241 94 237 93 233 92 229 90 224 89\n113 0.999 112 95 113 99 116 101 120 101 123 102 127 102 130 103 134 104 137 104 140 105 144 103 142 99 139 98 136 97 133 97 129 96 126 95 122 95 119 94 114 93\n113 0.999 149 98 149 103 149 106 150 109 154 110 158 111 162 112 165 112 170 113 174 114 178 114 178 109 179 105 176 102 172 102 169 101 164 100 161 99 157 98 153 98\n113 0.999 221 104 221 107 220 111 223 113 226 114 229 114 233 114 235 115 239 115 242 115 246 115 247 112 244 109 241 108 239 107 235 106 233 106 230 105 227 104 224 104\n113 0.999 184 107 183 110 185 114 189 115 192 115 196 116 200 117 204 118 207 119 212 120 216 119 217 114 214 113 210 112 206 112 203 110 198 110 195 109 190 108 187 107\n113 0.999 232 115 232 116 232 118 232 121 234 122 237 123 239 123 241 124 244 124 246 124 249 124 250 121 250 118 248 116 246 116 244 115 241 115 239 115 237 
114 234 114\n113 0.999 224 120 223 125 225 129 228 129 233 130 237 131 241 131 246 132 250 134 254 134 259 134 260 129 258 126 253 125 249 124 245 124 241 123 237 122 232 120 228 120\n113 0.999 422 131 410 135 398 139 386 143 373 147 362 151 349 155 337 159 326 163 327 174 328 185 342 182 353 179 366 175 378 171 390 167 403 163 415 160 425 154 424 143\n113 0.999 307 154 301 155 295 158 290 159 285 161 279 163 273 165 267 168 266 172 267 177 267 183 273 182 278 181 283 179 289 176 295 174 300 172 306 171 308 166 308 160\n113 0.999 459 163 444 166 428 170 412 174 396 179 380 182 364 186 349 191 334 195 330 206 330 219 348 217 363 213 379 210 395 206 411 203 428 198 444 195 460 192 461 179\n113 0.999 307 177 299 179 290 182 281 185 271 188 266 193 266 202 267 210 267 218 266 226 267 233 278 232 288 231 297 228 307 225 314 221 313 212 311 203 309 194 308 187\n113 0.999 49 208 48 215 51 219 57 220 63 220 69 221 75 221 82 222 88 222 94 222 101 221 101 215 96 211 91 211 85 210 79 210 72 209 66 209 60 208 54 207\n113 0.999 50 221 49 226 51 232 57 232 63 232 69 232 75 233 81 233 87 233 94 235 100 234 101 227 97 223 91 222 85 222 78 221 73 221 67 221 61 221 55 220\n113 0.999 49 239 49 243 53 246 59 246 65 246 71 246 76 246 81 247 87 247 93 248 96 242 95 236 91 236 85 235 79 235 74 235 68 234 63 233 57 233 51 233\n113 0.999 443 237 426 238 409 241 392 243 375 246 358 249 341 251 324 253 308 255 292 258 284 268 301 267 317 265 335 263 352 261 369 259 386 257 403 254 420 252 438 250\n113 0.999 468 250 446 252 425 254 404 257 383 259 363 261 342 264 321 266 301 268 281 270 269 280 290 280 311 277 332 275 352 273 373 272 394 270 415 268 436 265 457 263\n113 0.999 476 266 455 265 433 267 411 269 390 271 368 272 347 274 325 276 304 278 284 280 269 286 288 289 310 287 331 286 353 285 375 283 397 282 419 280 440 278 462 277\n113 0.999 458 280 456 278 453 278 450 278 447 278 444 279 441 279 439 280 435 279 434 281 435 284 437 287 440 286 443 286 446 286 449 286 452 286 455 285 458 285 
459 282\n113 0.999 427 285 411 285 395 286 380 287 363 288 348 288 331 289 316 289 300 290 285 292 276 299 292 300 307 299 323 298 339 298 355 297 370 296 385 295 401 295 418 295\n113 0.999 1 314 26 331 61 331 99 331 137 331 176 331 213 331 252 331 289 331 327 331 364 328 341 313 304 313 266 313 228 313 190 313 152 313 114 313 77 313 38 313\n114 0.999 234 104 237 107 242 107 248 107 253 107 257 106 262 106 267 107 273 107 278 107 279 96 276 95 271 95 266 95 260 94 255 95 249 94 244 94 238 95 234 97\n114 0.999 118 175 136 181 174 168 205 161 237 157 266 155 302 161 337 169 366 179 387 183 390 155 373 130 345 119 310 113 276 110 242 110 209 111 175 117 143 126 121 136\n114 0.999 88 233 109 244 146 245 183 245 220 245 257 246 297 247 338 248 378 248 413 248 417 215 399 193 363 193 323 191 285 191 247 190 210 191 170 190 130 191 93 192\n114 0.999 68 261 83 268 106 269 128 269 150 269 172 270 194 270 216 271 239 271 261 270 284 269 267 260 244 260 223 260 200 260 178 259 156 259 133 258 111 258 88 258\n114 0.999 292 280 303 282 320 283 338 283 356 283 371 282 387 282 408 283 426 283 439 282 439 263 426 262 412 261 395 261 377 260 359 260 344 259 328 259 310 258 295 259\n114 0.999 67 270 83 278 105 278 127 279 150 279 172 280 193 280 215 280 238 281 260 281 283 277 266 271 244 271 222 270 199 270 178 270 155 269 133 269 111 269 88 269\n114 0.999 65 280 65 289 71 289 78 288 84 288 91 289 97 289 104 289 111 289 118 289 126 287 125 280 118 279 112 279 104 279 97 279 90 279 84 279 77 278 70 278\n114 0.999 405 285 393 282 381 282 370 282 359 283 348 284 336 284 325 284 312 285 310 292 310 304 319 308 331 307 343 306 354 305 364 303 375 302 387 302 398 302 404 297\n114 0.999 63 298 82 302 105 302 128 302 150 303 172 303 195 303 217 303 240 303 262 303 281 297 262 294 239 294 217 293 194 293 172 293 148 292 126 292 104 292 82 292\n114 0.999 406 295 403 303 401 311 399 319 398 327 400 332 407 332 414 332 421 332 428 332 437 332 439 327 440 318 441 311 442 302 439 297 433 297 425 
296 417 295 411 295\n114 0.999 63 304 80 313 101 313 123 313 146 313 168 314 191 314 213 314 236 315 258 316 281 312 264 305 242 305 219 304 196 304 174 304 151 303 128 303 106 303 84 303\n114 0.999 62 315 75 322 92 323 109 323 127 323 144 324 162 324 179 324 196 325 214 325 232 323 219 314 201 314 183 314 164 313 147 313 130 313 113 313 95 313 79 313\n115 0.999 378 50 347 50 320 51 293 51 263 51 237 53 216 53 186 53 153 53 122 55 124 86 151 88 180 88 209 87 238 87 265 87 293 87 323 87 353 85 375 77\n115 0.999 66 113 67 152 95 162 143 155 190 151 240 148 287 150 327 152 362 155 403 161 439 164 439 124 408 105 362 100 320 97 278 96 236 96 195 97 150 99 105 106\n115 0.999 194 161 194 191 192 217 222 218 258 217 294 217 328 219 356 218 375 219 402 219 427 219 428 191 422 166 393 162 364 161 336 158 310 158 281 158 251 158 221 159\n115 0.999 90 195 92 202 103 206 108 200 116 194 125 192 133 193 140 198 147 203 152 206 161 202 159 194 152 188 146 182 138 178 129 176 120 177 110 179 102 183 96 190\n115 0.999 166 222 155 219 149 227 143 237 131 243 121 244 111 240 102 232 96 222 86 221 84 229 88 239 96 249 106 256 117 257 126 258 140 258 149 252 157 243 163 233\n115 0.999 185 226 195 247 225 247 251 247 278 247 305 248 331 249 358 249 385 250 410 250 436 250 428 229 400 229 373 227 344 227 318 227 290 227 264 226 236 226 210 226\n115 0.999 167 252 180 273 209 273 235 273 265 274 295 274 323 275 352 275 379 276 406 277 433 276 424 257 393 256 364 256 336 254 308 254 278 253 250 253 221 252 192 251\n116 0.999 118 109 141 118 163 111 190 106 217 104 244 105 271 106 297 111 323 116 348 123 353 99 336 83 308 80 279 78 252 77 222 76 195 77 169 78 143 81 119 84\n116 0.999 451 87 448 95 447 100 444 117 442 126 442 135 452 144 465 145 476 145 485 148 493 148 495 140 496 128 496 130 497 121 497 107 495 93 481 87 469 87 460 87\n116 0.999 113 139 119 163 123 193 149 204 176 203 213 202 243 204 273 203 301 200 325 197 353 195 350 162 343 138 314 126 282 123 256 113 225 110 190 112 165 
118 141 125\n116 0.999 335 205 335 206 334 209 334 215 335 221 338 223 342 222 345 222 349 222 352 222 356 220 356 217 356 212 356 207 355 203 354 203 348 203 344 203 341 203 338 204\n116 0.999 307 205 306 210 304 217 303 223 302 229 302 233 307 234 314 233 319 233 325 233 330 232 333 229 333 223 334 216 335 215 335 207 330 205 321 204 316 204 311 204\n116 0.999 118 225 118 236 132 236 144 235 157 235 170 235 182 235 193 234 206 234 210 226 211 220 208 210 194 207 181 207 168 206 156 206 144 205 134 205 121 207 119 217\n116 0.999 498 233 486 234 471 235 456 236 444 236 438 241 437 251 437 263 438 272 438 281 440 291 451 293 464 293 479 291 494 290 498 283 498 273 498 264 498 254 498 244\n116 0.999 298 240 297 245 295 254 293 262 291 269 295 271 302 271 309 272 315 272 320 272 328 273 330 269 331 260 332 253 333 246 331 243 323 242 316 241 310 241 304 241\n116 0.999 117 261 119 273 133 273 146 274 160 274 173 274 186 273 198 274 211 274 216 263 219 254 216 244 201 244 188 244 173 244 160 244 148 244 135 243 120 243 118 252\n117 0.999 406 102 378 70 344 48 307 33 268 24 226 24 188 35 170 35 128 52 96 74 69 110 99 132 125 102 162 82 194 61 225 65 266 65 305 74 339 91 370 119\n117 0.999 281 91 273 89 264 89 254 89 245 89 234 89 224 89 214 89 204 90 202 100 203 113 210 118 220 118 230 118 241 117 249 117 259 116 268 116 277 116 280 108\n117 0.999 347 133 333 119 309 120 286 121 263 122 239 123 215 125 193 125 170 126 147 127 130 134 144 150 167 149 191 148 215 147 237 146 260 145 283 145 306 144 329 143\n117 0.999 279 155 272 154 262 155 253 155 245 155 234 156 225 156 216 157 206 158 206 168 206 181 213 183 223 182 232 182 242 182 250 181 259 181 268 180 276 180 278 172\n117 0.999 380 182 356 161 329 179 301 190 270 198 239 201 208 201 176 195 147 184 121 169 93 186 112 206 140 219 172 229 204 237 236 238 269 235 301 227 329 217 355 202\n118 0.999 39 63 39 71 44 73 51 73 59 73 65 73 72 73 80 73 87 73 95 73 97 71 97 62 92 62 85 62 78 62 71 63 64 62 57 62 50 61 42 
61\n118 0.999 113 82 106 78 96 75 86 74 76 73 67 73 57 74 48 74 38 75 29 77 25 82 31 87 40 84 51 82 60 81 70 82 79 82 89 83 99 85 109 88\n118 0.999 45 114 47 116 52 116 57 116 63 116 67 116 73 116 79 116 84 116 90 116 92 111 90 108 84 108 79 108 74 108 69 108 63 108 58 108 53 108 47 108\n118 0.999 23 116 23 127 32 127 42 127 53 127 63 127 73 127 83 128 94 128 105 128 113 128 113 116 104 116 93 116 83 116 72 116 62 116 52 116 41 115 32 115\n118 0.999 22 131 29 136 37 136 47 136 57 136 67 136 78 137 87 137 96 137 106 138 113 133 107 129 97 129 88 129 78 129 68 129 58 129 48 129 39 128 30 128\n118 0.999 32 141 37 144 45 144 52 144 60 144 67 144 76 144 84 144 92 144 99 144 104 140 99 137 91 137 84 137 76 136 68 136 60 136 52 136 44 136 37 136\n118 0.999 42 144 42 150 48 151 53 151 60 151 65 151 71 151 77 151 83 151 90 151 94 151 93 145 88 144 82 144 77 144 71 144 64 144 58 144 53 144 47 144\n119 0.999 292 41 286 32 277 25 268 21 257 21 247 21 237 23 228 28 220 35 213 42 209 51 220 54 225 47 232 40 239 34 249 32 258 31 267 32 275 38 281 46\n119 0.999 287 96 281 83 273 83 265 85 261 84 250 85 242 87 233 88 224 90 216 91 208 95 212 106 220 110 229 112 239 114 249 114 257 111 266 108 273 105 281 100\n119 0.999 390 117 366 101 326 106 294 111 254 118 224 120 186 123 142 127 112 139 114 174 113 216 135 238 174 235 220 214 246 204 279 198 308 198 345 197 380 187 391 159\n120 0.999 198 67 202 74 205 80 212 80 219 78 227 77 234 77 241 78 248 80 254 82 258 76 259 68 256 61 249 59 241 58 233 58 226 58 218 59 212 61 205 64\n120 0.999 95 68 94 101 111 117 145 117 178 118 210 119 250 118 271 117 305 117 336 117 366 116 368 85 340 76 308 74 270 69 250 81 218 79 186 75 156 70 124 68\n120 0.999 96 131 106 133 119 133 132 133 145 133 159 133 173 133 186 132 199 132 211 132 221 122 210 121 197 121 184 121 170 121 157 121 144 120 131 121 118 121 104 121\n120 0.999 262 131 271 134 282 134 294 134 306 134 318 134 330 134 342 134 354 134 366 134 372 123 363 120 351 120 339 120 327 120 315 
120 304 120 291 120 280 120 268 120\n120 0.999 147 164 163 167 176 169 196 172 214 173 232 174 250 173 269 173 287 172 307 169 311 151 303 146 285 148 266 149 248 150 229 150 210 150 192 149 173 146 154 144\n120 0.999 68 172 57 173 45 175 34 176 24 178 13 179 2 181 0 188 0 198 0 208 0 215 12 215 23 214 34 213 44 212 55 210 64 209 70 201 69 193 69 183\n120 0.999 124 197 146 201 167 204 191 207 215 208 238 209 263 208 287 208 308 205 329 203 337 185 318 180 296 183 273 185 249 185 227 185 204 184 179 181 156 178 131 176\n121 0.999 197 90 197 95 197 100 198 105 198 111 200 114 206 115 213 114 219 114 224 114 230 113 230 107 230 101 229 96 229 89 224 88 218 88 213 88 207 89 202 89\n121 0.999 361 88 355 89 349 89 342 90 335 91 329 92 324 94 325 101 326 107 327 113 328 118 332 120 338 118 344 117 352 116 358 115 362 111 362 106 361 100 361 95\n121 0.999 352 152 344 157 336 161 331 166 322 171 314 176 304 179 296 182 287 184 283 187 285 198 295 198 304 197 313 195 322 191 330 186 337 182 345 176 353 171 356 163\n121 0.999 215 171 215 176 222 180 227 183 233 186 238 189 245 192 252 193 259 195 265 196 269 190 266 183 260 181 254 179 246 176 240 173 235 171 230 168 224 164 219 164\n121 0.999 197 201 198 220 200 237 219 237 242 237 262 237 283 238 302 238 323 239 343 239 361 237 361 217 358 202 337 203 316 203 296 203 276 203 257 203 237 202 216 202\n121 0.999 246 246 246 260 256 267 274 267 290 266 306 266 322 266 338 266 355 267 370 267 385 265 385 249 373 244 356 244 340 244 323 244 308 244 293 244 277 244 260 244\n121 0.999 173 242 172 253 171 264 171 275 173 289 177 295 190 297 201 297 215 297 228 297 233 295 233 283 233 272 232 260 230 244 225 240 216 240 206 240 195 239 182 240\n121 0.999 248 285 251 296 270 295 289 295 307 295 324 295 341 295 359 295 377 295 393 293 396 278 384 270 369 270 351 269 333 269 316 268 298 268 282 268 264 267 248 270\n122 0.999 317 82 305 85 291 88 278 91 264 95 252 98 239 101 227 105 214 108 202 112 204 127 216 126 229 123 243 120 256 117 
269 114 282 111 294 108 307 104 318 101\n122 0.999 344 104 326 108 306 111 286 116 267 121 248 125 228 130 210 135 190 139 176 147 181 171 198 170 220 165 240 160 258 156 278 151 296 147 316 143 335 139 347 135\n122 0.999 326 145 311 147 295 150 280 154 265 157 250 160 234 163 219 167 203 171 196 178 203 198 217 201 233 198 250 194 264 190 279 186 294 183 308 181 323 179 327 171\n122 0.999 312 185 301 186 286 187 274 191 261 195 250 201 239 207 229 213 218 220 207 227 209 238 220 240 231 234 242 227 254 221 265 215 277 212 288 208 301 207 309 203\n122 0.999 494 233 490 233 482 234 476 234 469 235 462 237 461 242 462 251 462 257 463 265 464 269 469 271 476 270 484 270 491 269 496 268 496 263 496 255 495 248 495 239\n122 0.999 50 277 44 277 34 278 26 280 18 282 10 283 1 286 0 292 0 300 0 312 0 316 7 317 17 316 27 315 35 313 43 312 51 312 52 309 51 300 51 288\n122 0.999 122 331 120 312 113 303 101 305 89 307 77 309 65 312 54 314 43 316 32 319 22 321 17 331 29 332 41 332 55 332 66 331 78 331 89 331 102 331 114 332\n123 0.999 315 49 292 40 266 34 241 31 215 30 189 32 164 35 139 42 115 50 91 63 99 86 117 86 138 78 164 72 192 68 216 68 244 70 272 75 298 84 310 74\n123 0.999 106 143 103 143 101 143 98 143 95 143 92 143 90 143 88 144 88 147 88 149 88 153 91 154 93 154 95 154 99 154 102 153 103 153 106 151 106 148 107 146\n123 0.999 122 154 117 153 110 153 105 153 99 153 93 154 87 154 82 154 76 154 72 156 71 162 76 164 82 164 87 164 94 164 100 164 106 164 112 163 118 163 122 160\n123 0.999 146 172 138 172 130 172 122 172 114 172 107 173 100 173 93 173 84 173 82 178 82 187 90 188 96 188 104 188 113 188 120 188 127 187 135 187 143 187 146 180\n123 0.999 406 180 401 178 394 178 389 178 382 178 376 178 370 180 365 180 358 180 353 182 354 188 358 191 364 191 370 190 377 190 383 190 389 189 395 189 401 189 406 186\n123 0.999 403 189 397 189 392 189 387 190 381 190 376 190 371 190 366 190 360 191 357 195 358 200 363 201 368 201 373 201 379 201 384 200 390 200 395 200 401 200 404 
195\n123 0.999 83 200 84 205 93 206 100 206 107 206 114 206 122 206 131 205 137 205 141 203 142 196 139 191 131 191 125 191 118 191 111 191 104 191 96 191 90 191 84 192\n123 0.999 407 202 401 202 395 202 390 202 383 202 377 203 371 203 366 203 360 203 355 203 354 210 359 211 365 211 370 211 377 211 383 211 389 210 395 210 401 210 407 208\n123 0.999 288 213 289 217 294 217 301 217 306 217 311 217 318 217 325 217 330 217 333 215 334 210 331 205 326 204 320 204 315 204 309 204 303 204 298 204 292 204 289 205\n123 0.999 94 210 94 216 93 222 96 225 102 225 107 225 112 225 118 225 122 224 128 225 131 224 131 218 131 211 128 210 123 210 118 210 113 210 107 210 103 210 98 210\n123 0.999 229 212 230 218 233 223 238 223 243 222 249 222 255 222 261 222 266 222 271 223 276 220 276 214 271 212 266 211 261 211 256 211 251 211 245 211 239 211 234 211\n123 0.999 286 220 286 227 287 233 294 232 300 232 306 232 312 232 319 232 325 232 330 232 336 232 336 225 333 219 328 219 322 219 316 219 309 219 303 219 297 219 292 219\n123 0.999 223 225 223 232 229 235 236 235 243 235 248 234 255 235 264 235 269 236 276 236 282 234 282 226 277 223 270 224 263 224 256 224 249 224 242 224 236 224 229 224\n123 0.999 298 235 298 239 298 243 298 247 302 247 306 247 310 247 315 247 318 247 321 247 325 247 325 242 325 237 323 234 319 234 316 234 312 234 308 234 304 234 301 234\n123 0.999 125 251 136 264 155 264 173 264 192 264 211 265 231 265 251 265 269 265 287 265 306 264 296 252 276 252 257 252 239 252 220 252 201 251 181 251 163 251 144 251\n124 0.999 69 91 112 113 150 92 179 72 215 58 255 54 289 58 312 67 349 91 384 99 419 76 387 49 347 24 305 12 289 12 246 12 202 14 165 26 134 41 103 63\n124 0.999 375 174 370 141 362 110 350 97 320 94 286 93 253 93 210 96 179 96 172 95 144 99 146 129 149 155 167 160 225 158 255 157 266 159 299 165 337 171 365 175\n124 0.999 337 173 321 172 301 172 280 173 257 174 237 174 216 174 191 174 170 172 156 176 155 202 172 205 191 204 212 203 236 202 256 202 275 201 295 
200 317 200 335 199\n124 0.999 296 203 290 203 278 203 266 204 253 205 239 205 227 205 204 205 198 215 198 219 197 237 203 240 216 239 230 239 250 238 262 237 266 238 281 239 295 237 295 226\n124 0.999 417 284 406 249 389 229 354 248 305 268 266 276 224 276 173 264 135 249 106 233 94 248 84 299 116 334 150 353 203 367 242 370 284 367 331 356 368 336 402 309\n125 0.999 336 30 317 22 296 18 276 19 256 22 237 27 218 33 200 41 183 50 167 62 154 74 173 86 190 74 208 64 226 55 246 50 265 45 283 44 302 47 322 49\n125 0.999 333 60 315 62 296 66 277 70 259 75 240 78 222 83 203 87 184 91 166 95 158 115 177 114 196 110 215 106 234 101 253 97 270 92 288 89 306 86 324 82\n125 0.999 350 114 335 92 314 101 294 114 274 125 252 131 231 135 209 138 187 136 164 131 148 148 159 164 182 171 205 173 229 170 251 164 273 158 295 150 314 139 332 128\n126 0.999 357 80 336 62 309 44 282 34 254 29 234 36 215 46 189 52 161 59 140 71 143 90 158 115 179 104 202 86 226 78 242 78 271 86 301 96 326 104 343 96\n126 0.999 110 125 112 156 114 195 122 210 158 196 197 183 226 179 258 178 292 182 328 191 363 199 365 176 363 140 347 124 313 124 278 124 246 123 210 122 174 122 142 122\n126 0.999 219 197 205 200 192 205 182 213 173 224 165 234 156 244 146 251 134 258 135 270 137 283 149 278 160 270 170 260 179 249 186 241 195 232 207 225 219 222 221 209\n126 0.999 281 200 277 212 277 225 289 233 299 242 308 252 317 262 327 270 338 277 350 283 362 286 365 273 365 263 354 256 343 247 333 239 323 229 314 219 304 210 293 203\n126 0.999 45 293 43 301 42 310 47 314 55 315 63 315 71 315 79 316 87 316 95 317 103 315 105 307 106 299 101 293 92 294 85 294 76 293 67 293 58 292 52 292\n126 0.999 404 307 403 314 410 317 417 317 426 317 432 316 438 316 445 316 453 314 455 309 457 302 455 293 451 293 443 292 435 292 428 292 420 292 412 291 405 299 405 301\n126 0.999 20 326 50 355 92 355 142 355 193 355 243 355 291 355 340 355 389 355 439 355 484 349 451 321 407 321 361 320 311 320 263 318 213 318 163 318 112 318 64 
318\n127 0.999 177 78 179 89 186 95 197 95 208 95 220 98 231 102 241 108 250 115 258 122 267 118 270 108 261 101 252 94 243 87 233 83 222 79 210 77 199 75 189 75\n127 0.999 128 78 113 92 110 117 108 138 105 157 101 188 100 212 98 236 95 262 92 289 107 294 120 281 122 259 124 234 127 215 129 193 132 170 135 139 138 113 140 92\n127 0.999 161 109 159 125 156 138 170 143 184 147 198 151 212 155 225 160 239 163 252 167 265 172 269 156 270 145 257 141 243 136 229 131 215 127 201 122 188 117 174 113\n127 0.999 172 186 176 190 185 194 193 198 201 201 210 203 219 205 229 206 240 206 244 205 245 191 242 188 233 188 223 187 214 185 205 183 197 179 189 174 180 170 176 176\n127 0.999 78 301 78 304 82 306 86 307 89 307 92 309 96 310 100 311 103 311 106 313 110 309 109 305 105 305 102 304 99 303 95 301 92 300 88 299 84 299 81 298\n127 0.999 80 310 78 313 81 315 85 316 87 318 91 319 94 319 97 320 100 320 103 321 108 320 108 315 104 314 102 313 98 311 95 311 92 310 89 310 85 309 82 308\n127 0.999 75 320 75 324 78 325 82 326 86 328 90 329 94 330 98 330 102 332 105 332 110 329 108 325 104 324 101 323 97 322 93 320 89 320 86 319 81 318 78 317\n127 0.999 159 324 161 328 167 329 172 329 178 329 183 330 189 330 194 331 199 332 205 332 211 330 208 327 203 325 197 325 192 324 187 324 181 323 175 323 170 323 165 323\n127 0.999 151 328 153 335 160 334 167 335 174 335 181 335 188 336 195 337 201 337 208 338 216 337 212 331 205 331 199 331 192 330 185 329 178 329 172 329 165 329 157 327\n127 0.999 79 331 79 333 80 335 82 337 85 337 87 338 89 338 92 339 94 339 97 339 100 337 99 333 97 332 95 332 93 332 91 330 88 330 86 329 83 329 81 328\n127 0.999 159 336 161 340 166 340 171 341 176 342 181 343 187 343 191 342 195 343 200 343 205 341 205 337 200 337 194 337 189 336 184 336 179 335 173 335 168 335 163 335\n127 0.999 70 340 69 344 74 345 77 346 82 347 86 348 90 349 94 350 98 352 102 352 107 349 106 344 102 343 98 342 94 341 89 339 86 338 82 338 77 336 72 336\n127 0.999 125 338 125 344 130 346 136 
346 141 346 146 347 152 347 157 347 162 347 168 347 173 344 171 340 165 340 160 340 154 339 149 339 144 338 139 338 134 338 128 337\n127 0.999 190 343 191 349 195 349 200 350 204 350 209 350 213 350 218 351 222 351 226 351 231 348 229 343 225 343 220 343 216 343 212 343 207 343 202 343 198 343 194 343\n127 0.999 114 359 103 359 92 362 81 364 69 368 56 371 43 374 33 378 30 388 31 399 31 413 40 413 51 409 61 405 71 401 83 396 94 391 104 388 114 381 114 370\n127 0.999 311 368 286 370 259 373 234 376 208 378 180 380 156 381 129 385 107 387 103 408 103 433 126 431 153 429 178 427 203 424 229 421 254 419 278 416 307 413 309 392\n127 0.999 323 416 292 419 261 423 230 427 200 430 170 433 139 435 109 438 78 440 49 441 29 455 59 455 89 453 120 450 151 448 181 445 212 442 242 439 272 436 302 431\n128 0.999 206 54 211 60 222 57 230 53 238 51 248 50 257 51 267 53 275 57 283 61 291 58 286 50 277 46 269 42 259 40 249 40 240 40 231 41 222 44 214 48\n128 0.999 226 56 226 63 230 65 236 65 242 65 247 65 253 65 259 65 264 65 269 65 274 63 273 57 268 57 263 56 258 56 252 55 247 55 241 55 236 55 231 55\n128 0.999 195 69 198 84 205 98 215 107 228 113 243 115 259 114 272 110 285 102 295 91 302 78 295 68 285 76 278 88 265 96 250 99 237 99 224 92 214 79 209 66\n128 0.999 107 77 110 79 114 79 117 79 121 79 125 79 128 79 132 79 135 79 140 78 140 71 138 71 135 71 131 71 127 71 124 71 120 71 116 71 113 71 109 73\n128 0.999 0 101 0 105 0 110 2 114 7 114 12 114 17 114 21 114 25 115 30 115 35 113 37 108 36 103 31 101 26 101 21 101 17 100 12 100 7 100 3 100\n128 0.999 440 164 434 163 429 164 423 165 417 166 412 167 406 169 400 169 395 170 389 171 389 178 395 179 400 178 406 176 412 176 417 175 423 174 429 173 434 172 439 171\n128 0.999 449 192 447 191 444 191 441 192 438 192 436 193 434 193 431 193 428 193 427 194 426 197 428 200 431 200 434 199 437 199 439 199 441 198 444 198 447 198 450 195\n128 0.999 197 194 193 195 188 196 184 197 180 198 176 199 171 201 167 202 163 203 163 207 163 211 167 212 
[Raw detection-result dump omitted. Each record appears to follow the format `<image_id> <confidence> x1 y1 x2 y2 ... xN yN`: an integer image index, a detection confidence score (e.g. 0.999), then the flattened x/y coordinates of the predicted text-boundary polygon (typically 20 vertices).]
233 168 185 182 141 187 135 237 177 242 228 229 279 212 329 193 381 174 431 158 478 151 537 143 589 143\n168 0.999 281 94 267 92 251 93 236 93 225 93 209 93 194 94 179 96 165 98 164 113 166 129 176 130 190 124 203 122 218 121 232 121 246 121 259 121 274 120 282 108\n168 0.999 225 145 218 142 210 142 201 144 195 145 187 147 179 148 171 149 168 157 169 163 168 170 176 176 184 174 192 172 199 170 207 168 214 168 221 167 225 158 225 150\n168 0.999 517 157 509 157 497 157 488 158 483 158 472 158 467 165 465 174 465 184 463 193 467 198 476 199 485 199 494 198 503 196 511 196 519 194 521 187 520 175 519 166\n168 0.999 326 205 326 239 349 253 388 253 421 253 457 253 494 253 528 253 560 253 595 252 629 249 630 216 608 205 570 204 536 203 500 202 464 202 428 202 394 201 358 201\n169 0.999 57 36 70 80 119 76 152 73 203 69 259 66 298 66 335 69 369 75 416 75 455 67 454 28 409 22 363 17 320 15 277 14 231 15 185 17 140 21 100 28\n169 0.999 484 82 445 87 397 97 335 104 291 106 239 106 181 107 116 103 59 96 15 109 10 167 63 181 127 188 184 191 236 191 289 190 349 186 400 177 450 165 489 140\n169 0.999 455 222 419 217 377 215 323 220 285 223 246 226 197 226 145 221 96 220 64 247 63 299 94 313 146 309 191 305 236 294 277 281 323 272 373 264 405 266 450 267\n169 0.999 431 275 422 276 405 279 378 276 361 277 346 277 327 276 303 277 280 290 276 313 271 335 284 340 310 334 330 330 348 327 365 325 387 322 404 315 416 304 429 302\n170 0.999 67 4 58 4 51 4 42 5 34 5 26 4 18 4 9 4 1 4 0 11 0 18 7 19 16 19 24 19 32 19 40 18 48 19 56 19 64 19 67 12\n170 0.999 156 7 156 16 163 19 172 19 182 19 192 19 201 19 211 19 220 19 229 18 238 14 236 5 229 4 219 4 209 4 199 4 190 4 180 4 171 3 161 3\n170 0.999 26 20 22 20 18 20 14 20 10 21 6 21 2 21 0 23 0 27 0 30 0 33 4 34 7 34 11 34 15 34 19 34 23 34 26 33 26 28 26 24\n170 0.999 60 39 56 35 48 35 41 35 34 35 27 35 19 36 12 36 4 36 0 40 0 45 4 49 11 49 18 49 25 49 33 49 40 49 48 49 57 49 60 44\n170 0.999 71 47 76 58 83 64 94 59 105 55 116 54 127 55 139 58 
150 63 160 66 166 58 169 50 160 44 148 39 138 36 126 34 115 35 103 36 92 39 81 43\n170 0.999 195 37 194 42 195 48 199 50 204 50 210 50 216 50 222 50 228 49 234 49 239 48 239 42 239 36 235 35 229 35 223 35 217 35 210 35 205 35 199 35\n170 0.999 66 54 57 61 49 72 42 82 38 95 36 106 34 119 36 135 40 147 45 158 53 162 66 156 64 148 58 135 56 123 57 108 61 95 66 83 73 74 74 63\n170 0.999 178 60 169 67 165 74 172 84 178 95 181 105 182 117 181 129 179 140 178 148 187 152 196 150 198 141 199 130 201 120 201 109 199 98 195 87 191 77 184 68\n170 0.999 70 79 71 91 72 99 84 97 96 93 108 91 121 91 134 91 147 92 159 95 166 95 166 81 162 72 151 68 139 64 126 62 114 62 101 64 90 68 78 74\n170 0.999 0 67 0 70 0 74 0 77 0 80 1 82 5 82 8 81 12 81 15 81 19 81 19 77 19 74 19 70 19 66 15 66 12 66 8 66 5 66 2 66\n170 0.999 168 99 156 97 144 97 132 96 119 97 106 98 94 98 82 100 70 103 71 113 71 124 82 126 93 126 105 126 117 126 130 126 143 124 156 122 167 120 168 110\n170 0.999 167 124 156 126 144 130 132 132 121 133 108 133 94 132 80 130 72 132 72 143 74 153 84 156 96 159 108 161 121 161 134 159 147 157 159 153 169 146 168 135\n170 0.999 140 168 136 163 132 163 127 163 123 163 118 163 114 163 109 163 105 163 105 168 105 171 106 174 110 175 115 175 119 175 123 175 128 175 133 175 137 174 139 170\n170 0.999 165 186 160 177 153 172 144 177 134 180 123 181 112 181 102 179 92 176 84 175 79 182 78 190 88 194 98 197 108 198 118 199 129 198 139 196 148 193 157 190\n170 0.999 0 174 0 178 0 184 0 188 6 188 11 189 16 188 21 188 26 188 31 189 36 188 37 183 36 178 35 174 30 174 24 174 19 174 14 173 9 173 4 173\n170 0.999 0 207 2 211 9 211 16 211 24 211 31 211 38 211 45 211 52 211 60 210 61 202 58 201 51 201 44 201 37 201 30 200 23 200 16 200 9 200 1 200\n170 0.999 5 215 5 220 5 226 10 227 15 227 20 227 26 227 31 227 37 227 42 227 47 225 47 220 47 215 42 213 37 213 31 213 25 213 19 213 15 213 9 213\n171 0.999 1 38 2 58 31 56 51 53 73 52 92 51 111 51 131 51 152 51 170 45 171 34 165 20 144 18 123 18 105 
18 86 18 66 17 44 17 23 17 2 20\n171 0.999 565 75 535 64 497 53 456 46 415 43 374 43 334 46 295 51 255 58 217 67 199 99 228 102 264 94 306 86 348 83 388 81 427 82 466 85 509 91 551 100\n171 0.999 148 125 162 176 220 153 266 140 322 131 369 126 420 127 468 130 520 137 569 153 610 166 600 120 549 102 495 93 445 89 394 86 344 87 291 91 241 103 195 113\n171 0.999 481 135 460 135 443 135 421 136 401 136 380 136 359 136 338 136 318 136 296 136 289 154 308 156 329 156 348 156 369 156 392 156 412 156 432 156 454 156 477 156\n171 0.999 551 148 516 155 489 161 452 166 416 169 381 169 345 168 308 163 272 156 239 150 226 180 260 185 294 192 328 197 366 201 404 199 440 198 475 196 510 189 548 180\n171 0.999 184 190 184 206 196 211 206 211 224 211 237 211 248 211 260 211 274 211 287 211 292 210 293 194 282 189 270 188 258 188 246 188 233 188 219 188 206 188 194 189\n171 0.999 517 203 524 209 539 209 550 209 564 209 577 209 589 209 601 209 613 208 625 206 625 192 617 190 604 190 592 190 580 190 569 190 557 190 544 190 531 190 520 191\n171 0.999 638 207 625 207 614 207 600 207 587 207 573 207 560 207 546 207 532 207 517 209 517 225 528 227 542 227 555 227 569 227 584 227 598 227 611 227 627 228 638 223\n171 0.999 290 212 278 210 271 210 259 210 250 210 238 210 227 210 216 210 204 210 192 212 192 226 198 229 210 229 219 229 232 229 245 229 256 229 266 229 279 229 288 226\n171 0.999 495 230 476 229 465 230 446 230 427 231 410 231 392 231 374 231 356 231 338 232 336 252 351 255 369 255 387 255 405 255 424 255 441 255 459 255 479 255 495 250\n172 0.999 125 88 125 102 123 124 121 146 123 164 140 166 158 170 177 175 195 182 212 188 225 195 233 185 233 166 234 149 225 135 208 125 195 116 177 106 161 94 141 88\n172 0.999 28 196 42 224 63 244 94 233 123 232 153 240 182 250 212 261 243 271 273 277 301 275 301 246 277 231 242 224 213 213 185 203 155 192 122 185 90 181 57 185\n173 0.999 151 23 152 47 163 56 181 56 201 56 220 56 239 56 257 56 275 56 293 56 309 55 311 28 300 23 280 22 262 22 
244 23 225 23 205 23 186 22 166 21\n173 0.999 177 59 177 73 187 77 199 77 211 77 224 77 236 77 248 77 260 77 271 77 282 76 283 59 274 56 261 56 249 56 237 57 225 56 212 57 200 57 188 57\n173 0.999 177 78 180 88 192 88 203 88 214 88 225 88 237 88 248 88 259 88 270 88 282 87 278 77 267 77 255 77 244 77 233 77 222 77 211 77 199 77 188 77\n173 0.999 178 92 185 97 196 97 207 97 218 97 229 97 239 97 250 97 261 97 272 97 282 92 274 88 263 88 252 88 241 88 230 88 219 88 209 88 197 88 186 88\n173 0.999 88 122 99 146 135 141 167 137 200 135 233 133 265 135 298 136 329 138 361 142 391 144 381 116 348 111 316 108 283 105 250 105 218 105 184 107 151 110 119 114\n173 0.999 21 276 61 279 109 279 154 279 201 279 247 279 293 279 339 279 385 279 430 279 455 249 413 247 369 246 323 246 278 246 232 246 185 246 139 245 92 246 46 246\n174 0.999 107 53 106 73 110 83 130 84 150 89 166 97 180 110 190 124 197 137 204 152 225 146 227 134 220 116 213 101 202 86 190 76 175 67 158 60 142 56 123 54\n174 0.999 89 56 77 61 62 68 51 78 38 88 30 100 23 113 18 125 14 135 9 151 20 156 37 158 41 148 46 131 54 119 63 108 75 99 88 91 97 83 93 71\n174 0.999 223 209 202 198 183 215 166 233 142 242 122 245 100 242 78 232 62 216 44 206 25 221 42 238 59 253 80 264 102 271 127 273 152 269 174 260 193 248 209 229\n174 0.999 157 207 151 206 147 208 139 211 132 213 125 214 118 213 110 212 104 210 97 208 93 213 97 217 104 220 111 221 118 223 126 223 134 222 141 220 147 216 153 213\n175 0.999 179 37 173 35 168 34 162 35 156 35 150 37 144 38 138 40 132 41 128 43 127 51 132 53 137 51 144 49 150 48 156 47 161 46 166 46 172 46 177 43\n175 0.999 61 45 63 48 68 49 73 49 79 49 84 50 91 51 96 52 102 53 107 52 111 45 109 41 103 41 96 40 91 40 85 39 79 39 74 40 69 40 64 41\n175 0.999 166 52 156 53 144 53 133 54 121 54 109 54 97 55 82 55 67 55 65 63 64 76 70 81 84 80 97 80 111 79 123 78 135 77 145 75 160 73 165 65\n175 0.999 189 78 175 81 155 85 135 89 116 93 95 96 73 98 57 110 60 129 64 153 64 176 64 183 81 183 109 173 136 
165 160 158 176 154 190 142 190 126 190 98\n175 0.999 176 204 164 199 153 195 139 193 126 193 113 193 100 194 85 196 70 199 66 208 67 221 74 228 87 225 100 220 115 218 128 218 140 219 151 221 166 225 172 216\n175 0.999 167 225 157 226 147 226 136 226 124 226 114 225 101 226 85 226 76 230 78 241 77 255 83 256 98 255 111 255 124 254 136 254 146 255 155 255 164 253 165 238\n175 0.999 174 257 162 255 149 255 136 255 123 255 109 256 95 258 77 257 63 256 61 263 62 279 67 289 82 287 96 285 111 283 126 280 138 278 149 277 164 278 170 271\n176 0.999 496 82 487 79 478 77 471 76 461 76 452 77 443 78 433 78 425 79 424 87 425 97 430 99 439 97 448 96 456 96 465 96 475 97 484 97 490 99 495 93\n176 0.999 86 147 95 182 122 172 152 150 184 133 219 129 257 133 287 146 318 168 347 190 373 186 376 159 347 134 316 114 285 98 246 88 208 86 173 91 138 105 109 126\n176 0.999 426 100 427 109 434 109 441 109 449 109 457 109 465 109 472 109 480 109 489 109 496 108 495 99 488 99 480 99 472 99 464 99 456 99 448 99 440 99 433 99\n176 0.999 433 120 433 125 435 131 441 130 447 130 454 130 460 130 467 131 473 131 479 131 485 130 485 123 482 118 477 118 470 118 464 118 457 118 451 118 443 118 438 118\n176 0.999 422 131 421 139 428 143 436 143 445 143 453 142 461 142 470 142 479 142 487 142 496 141 497 132 491 130 482 130 473 130 464 130 455 130 447 130 438 130 430 129\n176 0.999 188 138 186 150 186 165 186 175 198 177 211 178 225 180 238 182 250 183 262 186 272 186 274 171 276 158 277 146 265 144 250 143 235 141 222 140 206 139 198 138\n176 0.999 421 149 421 158 428 161 437 161 445 161 454 161 463 161 471 161 480 161 489 161 498 161 497 151 491 148 482 148 473 148 465 147 455 147 446 147 436 147 430 147\n176 0.999 405 221 404 228 409 229 415 229 421 230 428 230 433 231 439 231 445 231 451 232 456 225 453 221 448 221 442 221 436 221 431 220 425 220 418 220 413 219 410 220\n177 0.999 191 85 196 98 205 102 219 100 233 99 247 100 260 102 274 105 288 110 301 115 310 111 313 97 302 92 288 87 274 83 260 80 
246 79 232 79 217 79 205 82\n177 0.999 218 112 221 115 228 115 235 116 242 117 249 118 255 119 262 120 269 121 276 122 280 115 276 112 268 111 262 111 255 110 248 109 241 108 235 107 228 106 221 105\n177 0.999 207 117 210 125 219 126 228 128 238 129 247 130 257 132 266 133 275 135 285 136 295 133 290 125 281 124 272 123 262 122 252 120 243 119 234 118 225 116 216 115\n177 0.999 211 130 210 139 218 142 228 144 237 145 245 146 254 148 262 149 272 150 281 152 288 150 288 140 281 138 272 137 262 136 253 135 244 133 236 132 227 130 218 129\n177 0.999 197 141 198 152 209 154 221 156 233 158 245 160 256 162 268 163 280 165 292 168 303 168 301 158 289 155 278 153 266 151 255 149 243 147 232 145 219 143 208 141\n177 0.999 401 192 397 192 393 192 390 193 386 193 383 194 378 194 375 195 372 197 372 201 372 204 373 206 379 206 383 205 386 204 390 204 394 203 398 203 401 199 401 196\n178 0.999 127 140 150 149 173 135 197 123 225 116 251 112 279 114 305 121 329 131 351 144 374 137 355 120 331 107 307 98 280 91 251 90 225 93 198 99 172 109 150 124\n178 0.999 448 189 448 188 446 186 444 186 443 186 440 186 438 186 437 186 435 186 433 187 433 189 433 191 434 193 436 193 437 193 440 193 442 193 444 193 446 193 448 190\n178 0.999 280 194 274 194 268 194 261 194 255 194 249 194 243 194 237 194 230 194 224 194 220 197 225 199 231 199 238 199 244 199 250 200 257 199 263 199 269 199 275 199\n178 0.999 219 204 223 210 229 210 236 210 244 210 250 209 256 209 263 209 269 209 274 209 278 203 273 199 269 199 262 199 255 199 249 199 243 199 236 199 229 199 223 199\n179 0.999 212 106 216 115 218 121 225 123 233 122 242 121 250 120 260 121 267 123 274 125 279 121 282 113 278 106 270 104 262 102 254 100 245 100 236 100 228 101 220 104\n179 0.999 236 122 236 124 236 128 238 132 241 131 244 131 247 131 251 131 254 131 257 131 260 131 261 127 260 123 258 122 255 121 252 121 249 121 245 121 242 121 239 121\n179 0.999 50 137 61 138 73 138 86 138 97 139 109 139 121 139 134 140 145 141 158 141 163 129 
152 128 140 128 127 127 116 127 103 126 90 126 79 126 65 125 54 126\n179 0.999 344 131 343 134 344 137 344 141 344 144 346 146 349 146 352 146 355 146 358 146 361 145 363 142 362 138 362 136 362 132 359 130 356 130 354 130 350 130 347 130\n179 0.999 366 131 366 137 366 144 372 145 379 145 385 145 392 146 398 146 405 146 411 146 417 146 417 139 416 131 411 131 404 131 397 131 391 131 384 131 378 131 372 131\n179 0.999 366 145 367 150 372 151 378 152 383 151 389 152 394 152 400 152 406 152 411 152 417 150 414 146 409 146 403 145 398 145 393 145 387 145 381 145 377 145 371 145\n179 0.999 53 161 53 166 55 170 61 170 67 170 72 170 79 170 84 171 90 171 94 171 100 169 100 163 96 160 91 160 85 160 80 160 75 160 69 160 63 159 58 160\n179 0.999 381 169 381 175 381 182 385 186 393 186 399 186 406 186 413 186 420 187 427 187 431 186 432 177 431 169 427 168 420 168 413 168 406 168 399 167 392 167 387 167\n179 0.999 54 174 55 179 59 180 65 181 70 181 76 181 81 181 86 181 92 181 97 180 100 172 97 170 92 171 87 171 82 171 77 171 72 170 67 170 61 170 56 171\n179 0.999 208 191 221 193 236 193 252 194 269 195 286 195 303 195 319 195 337 196 356 195 359 181 348 176 331 176 315 176 298 176 282 176 264 175 247 175 230 175 209 175\n179 0.999 134 189 134 193 133 196 136 199 139 199 142 199 145 199 149 199 152 199 155 199 159 198 159 195 158 190 156 189 153 189 150 189 146 189 143 189 140 188 137 188\n179 0.999 60 192 60 196 61 201 65 200 69 200 74 200 77 201 82 201 86 201 90 201 94 200 94 195 92 191 88 191 84 191 80 191 76 191 72 191 68 191 64 191\n179 0.999 47 208 51 211 57 211 64 210 71 210 77 210 84 210 90 210 97 211 104 211 107 202 102 200 95 201 89 201 82 201 76 200 69 200 62 200 56 200 49 200\n179 0.999 382 200 382 206 387 209 392 209 398 209 404 209 410 209 416 209 420 209 426 210 431 208 431 201 426 200 420 200 415 200 409 200 404 200 398 200 392 199 387 199\n179 0.999 201 200 202 217 220 218 239 218 257 218 276 219 293 218 311 218 329 219 346 218 364 218 362 200 343 200 326 200 
308 200 290 200 272 200 254 200 236 199 218 199\n179 0.999 383 211 383 215 389 216 394 216 399 216 405 216 410 216 415 216 419 216 423 216 430 216 432 211 427 209 421 209 416 209 410 209 404 209 398 209 393 209 388 209\n179 0.999 63 211 62 215 64 218 68 219 71 218 73 218 76 219 80 219 83 219 86 219 91 216 92 213 90 211 86 210 83 210 80 210 76 210 72 210 69 210 66 210\n179 0.999 432 217 427 216 421 216 415 216 410 216 405 216 399 216 394 216 388 216 383 216 383 222 389 225 393 224 399 224 404 224 409 224 415 224 420 224 427 225 432 222\n179 0.999 59 219 58 223 60 227 65 227 69 227 74 227 78 227 83 227 87 227 91 228 95 226 96 221 93 219 89 219 84 219 80 219 76 219 71 219 67 218 63 218\n179 0.999 51 245 51 255 55 262 66 262 77 262 87 262 97 261 108 261 119 260 127 260 138 260 138 250 132 245 121 245 112 245 101 245 91 245 80 245 70 245 60 245\n179 0.999 323 244 323 253 323 263 332 264 343 265 352 265 361 265 371 265 381 265 389 265 397 265 397 255 396 246 389 244 379 244 369 244 360 244 350 244 341 244 332 244\n179 0.999 162 251 162 248 159 247 157 248 155 248 152 248 150 248 147 248 145 247 143 248 142 250 144 253 144 254 146 255 150 255 152 255 154 255 156 255 159 255 161 253\n179 0.999 400 251 399 255 402 258 407 258 411 258 416 258 419 258 423 258 426 258 430 258 435 258 437 254 435 250 431 251 426 251 421 251 417 251 413 251 408 251 404 251\n179 0.999 246 252 246 257 248 262 255 262 260 262 266 263 271 262 277 263 283 263 289 263 293 260 294 254 290 250 285 250 279 250 273 250 268 250 262 251 256 250 251 250\n179 0.999 400 263 402 265 405 265 409 265 414 265 418 265 422 265 426 265 429 265 434 264 434 259 431 258 427 258 423 258 419 258 416 258 411 258 407 258 403 258 400 258\n179 0.999 271 279 264 279 256 279 247 279 240 279 232 279 224 279 217 279 209 279 208 286 208 293 216 294 223 294 231 294 239 294 247 295 255 295 263 295 270 294 271 287\n180 0.999 414 28 385 27 354 28 326 29 297 32 269 37 242 42 211 49 181 59 155 71 155 90 184 86 215 81 242 77 271 72 299 
69 328 66 356 62 386 58 414 51\n180 0.999 401 59 375 61 350 65 323 68 299 72 274 75 250 78 225 81 198 85 173 89 162 104 188 101 213 98 238 95 263 92 288 89 313 86 338 83 364 80 390 77\n180 0.999 167 114 147 117 126 119 107 121 88 123 68 126 50 129 28 132 6 135 0 148 1 167 19 166 42 163 60 161 80 159 100 158 120 156 139 155 161 152 165 134\n180 0.999 126 156 110 158 94 159 78 160 64 161 48 163 34 165 17 167 0 170 0 183 1 200 15 199 31 197 46 196 60 194 76 193 92 192 106 191 121 186 123 173\n180 0.999 496 216 488 217 479 217 471 216 464 216 455 217 447 216 440 220 440 229 440 239 442 246 450 246 460 246 468 246 476 246 484 246 493 247 497 244 497 234 497 225\n180 0.999 398 219 356 220 313 222 270 222 229 223 187 224 145 225 103 226 60 227 22 227 1 249 45 249 89 250 129 249 169 248 213 250 256 249 300 249 344 248 380 248\n180 0.999 313 295 313 300 317 303 321 303 326 304 330 304 334 304 339 305 344 305 348 305 354 303 354 297 350 296 346 296 341 296 336 295 331 295 327 295 322 295 317 295\n180 0.999 436 297 437 304 437 314 442 318 450 318 458 318 466 318 474 318 482 318 491 319 497 318 497 309 497 300 492 296 484 296 477 296 468 296 460 296 453 296 444 296\n180 0.999 316 304 316 308 318 313 322 313 326 312 330 313 334 313 338 314 342 314 346 314 351 313 352 308 349 305 345 305 341 305 336 304 332 304 329 304 325 304 320 304\n180 0.999 350 318 351 322 354 324 356 324 360 324 364 324 367 325 371 325 375 325 378 325 382 322 379 317 377 317 373 318 370 318 366 318 363 317 359 317 356 317 353 317\n181 0.999 426 114 414 113 403 113 391 114 379 115 368 117 358 120 347 122 336 127 332 136 332 148 343 146 353 140 363 136 373 134 385 132 396 131 408 131 419 133 424 125\n181 0.999 436 136 424 137 411 140 398 142 385 143 374 145 361 147 350 150 338 151 329 157 331 168 343 168 356 165 368 164 379 161 392 159 405 158 417 157 430 155 438 148\n181 0.999 265 159 266 174 270 187 282 189 295 192 307 194 319 197 331 199 344 202 357 205 370 208 371 196 366 185 354 182 341 179 328 176 316 
172 302 169 290 164 278 161\n182 0.999 100 154 118 183 147 172 177 162 208 157 240 155 272 158 303 164 331 174 359 188 381 189 378 162 347 148 318 135 287 127 255 122 223 121 191 125 158 133 129 142\n182 0.999 203 168 203 183 204 192 218 192 230 193 242 193 254 193 266 194 277 195 288 195 299 196 299 181 295 173 283 172 272 171 260 169 248 169 236 168 224 167 211 166\n182 0.999 124 199 124 209 126 219 137 220 146 220 156 220 165 221 175 222 185 222 194 222 202 222 203 208 201 200 191 199 182 199 173 199 162 199 152 198 143 198 132 198\n182 0.999 291 208 291 220 295 227 308 227 320 228 331 229 343 229 353 230 365 230 376 231 387 231 387 215 382 209 370 209 359 209 347 208 336 208 325 208 313 207 301 206\n182 0.999 139 233 139 239 140 247 141 253 147 253 153 253 160 254 167 254 172 254 178 254 184 254 185 246 185 237 182 232 176 232 170 232 164 232 158 232 150 231 144 231\n182 0.999 295 239 295 251 299 259 311 259 323 260 334 260 345 261 356 261 367 261 378 262 389 261 389 245 384 240 373 239 361 240 350 239 338 238 327 237 316 237 304 237\n182 0.999 234 298 249 299 267 300 286 300 305 300 324 301 343 302 362 302 381 303 399 303 401 286 391 277 368 275 348 275 330 274 311 274 292 273 272 273 253 273 233 274\n182 0.999 243 319 243 326 248 327 254 327 260 326 266 326 272 326 278 327 283 328 289 328 295 327 295 319 290 319 283 319 277 319 272 319 266 319 260 319 254 319 248 319\n182 0.999 259 327 259 329 259 332 261 334 264 334 266 334 269 334 271 334 274 334 277 334 280 333 281 330 280 328 278 327 275 327 272 327 270 326 267 326 264 326 262 326\n182 0.999 244 335 246 341 251 341 257 341 263 341 269 341 274 341 280 341 286 341 290 342 297 339 294 335 289 335 283 335 278 334 272 334 266 334 260 334 255 334 250 334\n182 0.999 1 368 6 370 13 370 21 370 29 370 36 370 44 370 51 370 58 370 67 370 67 359 61 359 55 359 48 359 40 359 33 359 25 359 19 359 12 359 3 359\n183 0.999 411 3 385 3 359 3 336 6 311 10 286 15 263 20 239 24 215 27 191 27 180 45 204 48 230 47 255 44 279 39 
304 34 329 29 354 24 376 22 401 22\n183 0.999 10 35 17 40 23 32 29 24 36 18 45 17 54 18 61 23 68 29 73 36 80 31 75 22 69 15 60 9 52 6 42 6 33 8 24 12 17 18 12 27\n183 0.999 477 66 458 31 415 21 379 22 343 28 305 35 273 41 236 46 201 48 168 42 136 44 148 71 188 79 229 78 265 74 303 69 340 62 375 58 414 55 444 59\n183 0.999 98 35 98 39 97 44 100 48 103 48 107 48 112 48 116 48 121 48 126 49 128 47 128 42 128 36 126 35 122 34 117 34 114 34 109 33 105 32 99 33\n183 0.999 86 45 78 43 68 42 60 43 50 44 41 46 33 47 24 49 16 49 7 50 7 59 15 61 25 61 34 61 42 59 51 57 60 56 68 55 77 55 85 55\n183 0.999 441 68 430 66 419 66 409 66 397 66 387 66 377 66 366 66 355 66 344 66 341 76 352 77 363 78 374 78 385 78 395 78 406 78 417 78 428 78 439 78\n183 0.999 157 127 157 133 157 140 160 145 165 146 170 146 176 146 182 146 189 146 196 146 199 146 200 138 200 131 197 127 192 127 186 126 180 126 174 126 167 126 159 126\n183 0.999 170 151 170 171 188 174 209 174 228 175 247 175 267 175 286 176 307 176 329 175 338 171 338 151 320 148 300 148 281 148 261 148 242 148 223 148 202 147 182 147\n183 0.999 170 178 170 196 183 204 203 204 221 204 240 205 258 205 277 205 297 205 315 206 333 205 332 186 319 179 300 179 281 178 262 178 243 178 224 178 205 177 187 177\n184 0.999 52 65 61 83 74 94 90 85 110 78 131 74 150 74 173 79 193 86 207 93 218 82 226 65 206 54 187 45 168 40 148 38 126 39 105 42 85 47 68 57\n184 0.999 65 103 74 117 89 109 104 102 120 98 137 96 154 97 170 100 186 106 199 113 211 107 204 96 190 88 173 82 158 79 142 78 124 79 108 82 92 87 77 94\n184 0.999 190 137 190 142 191 149 193 152 198 152 205 152 210 152 217 152 224 152 227 152 232 152 233 145 233 138 229 135 224 135 218 135 212 135 206 135 200 135 195 135\n184 0.999 44 137 44 142 44 149 45 154 50 155 56 154 61 154 67 155 75 155 77 154 82 154 82 147 82 140 80 136 75 136 70 136 64 135 59 136 53 136 48 136\n184 0.999 105 202 105 211 105 220 111 224 121 224 130 224 139 224 150 224 160 224 165 224 173 223 173 213 173 204 165 200 
157 201 148 201 138 201 130 200 121 201 112 200\n184 0.999 198 218 191 218 177 224 162 229 148 231 133 231 120 230 106 227 94 221 82 223 79 234 91 240 104 245 118 248 133 249 147 249 161 247 174 244 187 239 201 233\n185 0.999 29 67 37 103 61 103 90 97 119 93 148 90 176 90 205 94 234 99 261 105 283 101 289 67 258 60 230 54 204 51 173 48 141 47 111 49 82 52 57 58\n185 0.999 60 154 78 156 102 156 127 156 151 156 175 155 198 154 222 154 245 153 260 142 257 114 242 106 218 106 196 106 173 106 149 106 123 106 98 105 73 105 61 117\n186 0.999 142 11 143 17 148 17 152 15 155 15 160 15 164 16 169 18 172 20 176 21 180 20 177 17 175 12 171 10 168 8 163 7 159 6 154 6 149 7 145 9\n186 0.999 131 49 134 54 138 59 143 63 149 66 156 68 163 68 170 66 176 62 180 58 184 51 181 47 175 50 170 55 164 58 157 58 151 57 146 54 141 50 137 46\n186 0.999 53 90 53 100 54 109 61 109 68 109 76 110 84 110 91 110 98 110 105 110 111 110 112 98 110 91 104 90 96 90 89 90 81 89 74 89 67 89 59 89\n186 0.999 196 96 196 105 196 114 204 115 211 115 219 115 227 116 234 116 240 116 247 116 254 115 254 106 252 97 246 96 239 96 232 96 225 95 217 95 210 95 203 95\n186 0.999 94 127 95 140 110 140 124 141 138 141 151 142 165 142 179 142 191 143 204 143 214 140 212 125 198 125 184 124 171 124 158 123 144 123 130 122 118 122 103 121\n186 0.999 55 147 65 160 86 160 107 160 128 161 148 161 169 162 189 162 209 163 229 163 250 163 239 151 218 150 197 150 177 150 157 149 136 148 115 148 95 147 74 146\n186 0.999 80 161 88 172 103 172 119 172 135 173 150 173 166 174 181 174 196 174 211 175 226 174 219 163 203 162 187 162 172 162 157 161 141 161 126 161 110 161 94 160\n186 0.999 73 172 80 184 98 184 115 185 131 185 148 186 165 186 182 187 198 187 215 188 232 187 225 175 208 174 190 174 174 174 157 174 140 173 123 173 106 172 89 172\n186 0.999 128 191 129 197 135 197 141 197 146 197 151 197 157 198 162 198 168 198 173 198 179 196 176 191 171 190 165 190 161 190 155 189 150 189 144 189 139 189 133 189\n186 0.999 95 204 104 
208 117 207 129 207 141 207 153 207 165 207 177 207 189 208 201 208 210 200 202 198 189 198 176 198 164 198 152 198 140 198 127 197 115 197 103 197\n186 0.999 114 209 118 214 127 214 135 214 143 214 152 214 160 215 169 215 176 215 185 215 193 210 188 207 179 207 170 207 162 207 154 207 145 207 137 207 128 207 121 207\n186 0.999 64 224 81 225 99 224 117 224 136 224 154 224 172 225 190 225 209 226 226 226 240 216 224 215 205 214 187 215 169 215 150 215 132 215 113 214 95 214 77 214\n186 0.999 128 226 128 233 134 234 139 234 145 234 151 234 156 234 162 234 168 234 173 234 177 231 177 225 172 225 166 225 161 225 155 225 149 225 144 224 138 225 133 225\n187 0.999 266 64 270 73 277 80 285 77 299 72 305 71 317 72 327 73 337 77 345 81 352 76 355 65 346 60 337 56 326 52 317 51 305 51 295 52 284 55 275 59\n187 0.999 110 43 109 63 108 85 107 104 126 122 132 122 167 123 186 121 204 121 225 120 248 125 246 101 245 80 245 56 243 46 220 46 192 46 173 45 150 44 130 43\n187 0.999 248 118 225 118 202 120 180 124 182 125 152 123 126 122 106 124 106 151 106 175 105 200 122 201 145 201 172 202 188 202 218 202 244 201 251 185 250 162 248 139\n187 0.999 269 131 269 137 269 142 270 147 281 147 282 147 288 147 294 147 300 148 305 148 311 147 311 142 311 136 309 130 304 130 298 130 291 130 286 130 280 130 274 130\n187 0.999 268 149 268 154 269 161 274 162 284 162 286 162 292 162 299 162 305 162 311 162 318 162 320 155 319 150 313 148 306 148 300 148 293 148 287 147 280 147 274 147\n187 0.999 100 229 99 241 99 254 102 264 128 264 130 264 144 264 158 264 170 264 183 265 196 265 196 251 197 238 191 229 180 229 166 229 151 228 138 228 125 227 112 227\n187 0.999 406 260 371 266 334 268 295 268 269 268 217 270 176 270 140 270 102 269 97 305 97 338 134 339 174 338 217 338 253 337 293 336 331 336 370 335 407 332 407 295\n188 0.999 344 110 325 87 299 70 271 61 242 58 213 62 186 72 163 86 144 104 127 126 112 154 141 170 160 151 177 130 200 116 226 107 253 106 279 112 304 128 323 132\n188 0.999 399 
*(Several pages of raw detection results omitted for brevity.)* Each record in these output files encodes one detected text instance per line, in the format `image_id confidence x1 y1 x2 y2 … x20 y20`: an image identifier, the detection confidence score, and 20 boundary control points (x, y) tracing the text contour, matching TextBPN++'s 20 control points per boundary. For example:

    189 0.999 232 57 232 70 231 82 240 91 250 98 259 104 270 110 281 116 292 120 302 124 313 128 313 116 314 105 305 98 294 94 283 89 272 82 261 75 252 69 243 62
171 271 172 259 173 248 174 238 175 237 183 237 195 247 195 257 194 269 192 280 191 291 189 301 188 313 186 323 185 326 174\n233 0.999 354 181 338 183 322 185 305 187 289 189 273 192 257 194 240 196 224 197 216 203 216 217 231 217 247 216 263 214 278 212 294 210 310 208 326 207 343 205 353 196\n233 0.999 334 206 322 208 309 210 297 211 284 213 272 214 260 216 247 217 234 219 229 225 230 236 242 235 254 234 266 233 278 231 290 229 302 228 314 227 328 225 335 219\n233 0.999 314 233 306 233 298 233 289 233 281 233 272 233 265 235 257 237 249 238 245 243 246 251 253 251 261 249 269 248 277 247 284 246 292 246 301 246 309 246 314 241\n234 0.999 153 143 152 149 152 155 156 158 160 159 165 160 171 161 176 161 182 161 186 162 191 160 192 154 192 147 189 145 184 144 179 144 173 144 167 144 162 143 156 142\n234 0.999 111 161 111 195 113 246 132 244 173 233 215 222 254 216 293 215 332 216 362 220 392 226 419 223 400 193 361 182 332 174 296 166 258 161 219 158 182 161 146 157\n234 0.999 295 224 295 236 300 245 311 246 322 247 334 248 346 249 357 248 368 249 379 248 390 247 391 233 388 226 376 224 364 224 353 223 341 223 329 223 317 223 305 223\n235 0.999 190 43 172 38 152 39 135 43 118 50 102 60 90 73 80 88 74 104 67 121 73 135 93 142 101 131 105 114 115 99 130 88 146 80 166 77 184 76 187 60\n235 0.999 208 52 202 65 195 75 195 86 206 94 217 105 224 117 229 129 233 145 238 155 248 151 261 147 268 141 264 127 259 114 254 102 247 90 239 79 229 69 218 60\n235 0.999 133 101 131 112 130 123 133 131 143 132 155 133 166 135 176 136 187 137 196 138 207 138 208 129 210 119 205 111 195 109 185 107 174 104 162 103 151 101 141 101\n235 0.999 135 138 135 147 135 156 138 163 148 163 158 164 167 164 176 165 186 165 195 165 203 164 203 155 203 146 199 139 191 139 182 138 172 137 162 137 151 136 143 136\n235 0.999 83 177 83 200 82 223 86 236 110 236 135 237 161 238 182 239 207 240 228 240 249 240 249 223 250 201 245 186 221 185 195 184 171 183 148 181 127 180 104 179\n235 0.999 132 258 131 268 
142 271 153 275 163 276 175 277 188 276 202 274 212 267 219 257 218 255 216 245 205 247 196 248 182 249 170 247 159 243 148 240 137 238 134 247\n236 0.999 34 66 39 78 48 83 64 78 79 75 93 73 108 74 124 76 139 80 152 84 163 80 168 67 153 62 138 58 122 54 107 53 92 53 77 54 62 57 48 61\n236 0.999 33 100 46 101 59 101 74 101 88 101 103 101 117 102 132 102 147 102 165 103 168 91 158 88 143 87 128 87 113 87 98 86 83 86 68 86 54 86 39 86\n236 0.999 34 105 38 116 53 116 68 116 82 116 96 116 111 116 125 116 140 116 153 116 168 116 163 106 149 106 134 105 120 105 105 105 91 105 76 104 62 104 48 105\n236 0.999 52 148 61 149 71 149 82 150 94 150 105 150 116 150 128 151 141 151 152 150 154 137 147 136 136 136 124 136 112 136 101 136 90 136 78 136 66 136 53 136\n236 0.999 47 162 51 169 63 168 75 168 86 168 97 168 108 168 119 168 131 168 141 168 151 164 145 159 134 159 123 159 112 158 100 158 89 158 78 158 67 158 56 158\n236 0.999 41 175 48 183 62 183 76 183 89 183 102 184 116 184 129 184 143 184 156 184 170 183 164 174 150 174 136 174 122 174 108 174 95 174 81 174 67 174 54 174\n236 0.999 1 182 1 184 1 187 1 190 4 190 6 190 9 190 11 190 14 190 16 190 19 190 20 187 19 184 18 182 16 182 13 182 11 182 8 182 5 181 3 181\n236 0.999 41 184 46 191 57 191 69 191 79 191 90 191 101 191 112 191 124 191 134 191 146 189 139 184 128 183 117 183 107 183 96 183 85 183 74 183 62 183 52 183\n236 0.999 31 191 38 201 53 200 69 200 84 200 99 200 114 200 129 200 144 200 159 200 174 200 167 191 152 191 137 191 121 191 106 191 91 191 76 191 61 191 46 191\n236 0.999 1 192 1 195 1 197 2 200 4 200 7 200 9 200 11 200 14 200 16 200 19 200 19 197 19 193 17 192 15 192 12 192 10 192 7 192 5 191 3 192\n236 0.999 152 207 140 207 128 207 116 207 104 207 92 207 81 207 68 207 56 207 55 218 55 229 67 230 79 230 92 230 103 230 115 230 128 230 140 231 152 229 151 219\n236 0.999 79 254 79 262 86 265 95 264 103 264 110 265 119 265 127 265 135 265 143 265 151 265 151 255 144 254 136 254 128 254 120 254 111 254 103 254 
94 254 87 254\n237 0.999 163 119 180 119 200 120 220 120 240 120 260 121 280 122 301 122 320 123 340 123 354 110 336 109 316 108 296 107 276 107 256 107 235 106 215 106 195 106 176 105\n237 0.999 130 141 133 174 151 181 178 177 207 175 238 174 267 175 296 176 324 180 351 185 377 187 378 147 358 144 328 143 299 142 270 141 243 141 214 140 185 139 157 139\n237 0.999 240 190 239 196 239 203 240 209 241 216 244 220 251 220 257 220 263 220 268 220 274 219 275 213 275 204 275 197 273 191 269 188 264 188 258 188 252 188 245 188\n237 0.999 152 227 150 257 166 264 190 264 214 265 239 265 264 266 289 266 312 267 335 267 357 267 357 231 342 231 319 233 293 235 269 236 246 236 221 234 197 231 173 226\n238 0.999 47 69 55 83 67 79 80 74 95 71 111 69 125 70 140 72 154 77 168 81 173 73 172 64 158 58 144 54 129 51 114 51 100 51 85 53 71 57 58 63\n238 0.999 83 80 83 89 84 95 94 95 103 96 113 96 122 95 131 95 141 95 146 92 146 91 146 82 144 76 135 76 127 76 119 76 110 76 102 76 94 76 86 76\n238 0.999 197 107 179 106 162 106 145 106 127 106 110 106 92 106 75 106 57 107 39 106 32 119 48 121 67 120 85 120 102 120 119 120 137 120 155 120 173 120 189 120\n238 0.999 55 132 61 141 76 141 88 141 100 141 114 141 127 141 141 141 154 141 168 141 174 134 168 127 156 127 143 127 130 127 117 127 103 127 90 127 76 126 63 126\n238 0.999 55 150 57 161 72 162 86 162 99 162 113 162 127 162 141 162 154 161 168 162 175 157 171 147 158 147 144 147 131 147 118 147 104 147 90 147 76 147 63 147\n238 0.999 157 170 148 167 140 167 132 167 121 167 113 167 103 167 94 167 84 167 75 168 74 179 81 182 91 181 101 181 110 181 120 182 129 182 139 182 150 182 157 178\n238 0.999 189 188 175 186 159 186 143 186 127 186 111 186 95 186 79 186 63 186 46 186 41 198 55 202 71 202 87 202 103 201 120 202 136 201 153 201 170 201 186 201\n238 0.999 48 210 55 222 70 222 84 222 98 222 114 222 129 222 144 222 159 222 174 222 183 218 179 207 163 207 148 207 133 207 118 207 103 207 88 207 73 207 58 207\n238 0.999 47 228 48 243 64 243 
80 242 95 243 111 243 127 243 143 243 158 243 172 242 184 242 181 228 166 228 151 228 136 228 120 228 105 228 90 228 75 228 60 228\n239 0.999 103 77 103 92 103 108 118 108 133 108 148 108 164 108 179 108 194 108 204 108 219 108 219 93 218 79 205 78 189 78 174 78 159 78 143 77 129 76 116 76\n239 0.999 94 119 98 134 108 140 124 136 140 133 155 132 172 132 190 135 204 137 217 141 229 139 232 122 219 117 204 113 188 110 171 109 155 108 138 109 121 112 108 115\n239 0.999 83 182 92 193 109 193 128 193 147 193 167 194 186 194 205 194 227 194 234 193 238 172 227 163 210 163 191 163 173 163 155 163 137 163 117 162 99 162 83 163\n239 0.999 235 198 216 197 198 197 179 197 162 197 147 197 130 197 111 197 92 197 85 206 85 223 102 224 120 225 139 225 156 225 173 226 190 227 209 227 228 227 234 216\n240 0.999 216 101 200 74 186 52 166 31 135 16 105 14 72 22 45 38 26 60 11 87 0 121 25 124 43 103 59 77 80 58 111 51 139 58 160 77 174 97 193 121\n240 0.999 143 156 134 156 128 156 123 157 109 157 99 157 89 157 79 157 77 164 77 173 78 182 85 181 95 180 104 180 114 180 122 180 130 180 136 179 143 178 143 166\n240 0.999 3 180 4 203 27 211 55 211 74 211 97 211 120 211 145 211 169 211 192 211 216 211 214 187 195 180 171 180 147 180 123 180 99 180 70 180 47 180 26 180\n240 0.999 55 213 55 228 55 244 77 244 84 244 99 244 111 244 126 245 140 244 153 244 166 243 166 228 165 216 152 212 138 212 125 212 110 212 93 211 79 212 68 212\n240 0.999 73 256 72 266 76 274 89 273 95 274 104 274 113 274 123 274 132 275 141 275 150 274 150 263 147 256 138 256 129 255 119 255 110 255 98 254 87 255 81 255\n241 0.999 100 62 117 120 158 110 202 104 245 102 295 101 346 101 394 101 447 105 512 113 543 99 542 64 483 49 443 45 393 43 345 42 293 40 246 42 196 44 149 51\n241 0.999 171 229 170 259 173 266 190 267 206 269 223 271 241 273 256 275 275 276 298 279 310 267 310 255 301 240 288 240 270 238 255 237 238 235 222 234 205 232 188 229\n241 0.999 498 268 496 242 480 234 456 236 432 238 411 238 390 239 368 240 346 
242 330 244 318 250 319 273 330 278 358 281 381 280 403 279 426 279 445 278 467 275 487 272\n241 0.999 88 320 135 324 177 324 222 322 265 324 312 326 363 327 408 327 460 328 516 328 536 285 498 281 443 281 401 281 350 281 302 281 256 280 209 282 161 282 113 283\n242 0.999 98 126 111 143 133 134 156 127 179 126 202 128 226 134 249 141 272 152 293 163 312 158 306 141 285 128 263 117 239 109 215 103 190 100 165 102 141 105 118 115\n242 0.999 109 184 109 209 121 222 143 227 167 233 187 238 209 244 231 250 253 255 274 260 292 260 293 233 291 216 261 212 242 207 220 203 198 198 175 194 154 189 130 185\n242 0.999 120 228 120 250 128 266 150 272 168 277 187 283 206 289 224 296 245 301 263 307 279 307 281 281 277 268 255 263 237 257 217 251 199 246 179 241 159 236 138 230\n242 0.999 115 305 114 313 114 326 114 334 123 338 132 340 140 343 150 344 160 346 168 349 177 349 177 338 179 325 178 314 171 313 163 311 153 310 143 308 133 305 123 305\n242 0.999 210 326 210 342 209 362 223 370 241 375 256 379 272 385 288 390 305 394 320 398 335 399 335 381 336 362 324 353 309 350 294 346 277 341 260 336 244 332 226 327\n242 0.999 111 342 111 371 111 405 133 416 157 426 181 435 205 445 229 455 254 465 276 475 294 473 299 447 302 424 275 411 253 403 231 395 208 387 184 378 159 370 133 353\n243 0.999 114 34 134 62 159 46 188 35 219 28 251 25 281 26 313 34 340 48 364 70 384 68 381 43 358 19 330 4 298 0 264 0 233 0 202 0 170 3 142 16\n243 0.999 216 121 216 133 216 144 223 150 232 150 242 150 251 150 262 150 271 150 280 150 289 150 290 138 290 125 283 123 274 123 263 123 254 123 245 122 235 121 225 120\n243 0.999 141 153 149 171 173 172 197 173 219 174 243 175 266 175 289 176 312 177 335 178 358 179 349 159 325 158 302 158 278 157 256 155 233 155 209 154 186 152 164 152\n243 0.999 175 185 188 189 204 189 221 189 237 190 253 191 269 191 286 192 302 192 319 193 330 182 317 178 301 177 284 177 268 176 251 176 235 176 219 175 203 175 186 175\n243 0.999 202 199 201 213 213 216 225 216 236 216 248 
216 260 216 271 217 283 217 295 217 307 218 306 201 296 200 284 200 272 200 260 199 248 199 237 199 225 199 213 198\n243 0.999 206 219 209 229 218 230 230 230 240 230 250 230 261 230 271 230 281 230 292 232 303 230 301 219 290 219 279 219 269 219 258 219 248 219 237 219 226 219 216 219\n243 0.999 194 244 205 246 217 246 230 246 243 246 256 247 268 247 281 247 293 248 306 248 313 236 303 234 289 234 277 233 264 233 251 233 238 233 225 232 212 232 198 232\n244 0.999 49 172 64 204 85 208 110 189 151 177 194 174 226 172 265 176 299 180 335 197 359 185 368 158 336 142 298 130 260 124 222 122 186 124 149 130 114 140 81 154\n244 0.999 334 197 307 182 275 177 247 172 222 172 197 172 172 175 147 176 118 182 102 196 93 221 118 222 146 221 175 221 204 220 233 219 262 219 288 219 310 220 325 212\n244 0.999 305 224 281 222 256 220 232 220 209 220 186 221 164 221 140 222 120 222 100 225 97 250 115 253 139 253 164 252 186 252 211 252 235 252 261 252 281 252 299 248\n244 0.999 322 244 322 251 329 254 339 253 348 253 357 254 365 254 373 254 385 254 389 251 390 241 390 236 385 229 376 229 366 229 356 228 348 228 339 228 329 228 325 233\n244 0.999 0 288 0 295 0 302 3 306 9 307 16 308 22 309 30 310 36 312 44 314 47 311 48 305 49 297 45 292 38 290 32 289 25 289 18 288 11 288 5 287\n244 0.999 0 308 0 315 0 322 0 328 6 329 12 331 17 332 23 333 29 334 36 335 40 334 42 328 42 322 40 316 33 314 27 313 22 312 16 311 9 309 5 308\n245 0.999 140 91 140 96 141 103 142 108 148 109 155 109 160 109 166 109 171 109 177 109 182 109 182 103 182 93 180 89 175 89 170 89 164 89 157 89 152 89 145 89\n245 0.999 60 90 60 111 59 150 72 157 102 157 132 157 159 157 190 157 215 158 241 158 266 158 266 139 265 109 236 111 215 110 188 108 163 109 137 104 109 97 84 90\n245 0.999 112 165 112 177 111 190 122 193 137 193 151 193 163 194 176 194 188 194 200 194 211 193 211 177 210 165 197 164 186 164 173 164 161 163 149 163 135 163 123 163\n245 0.999 90 206 103 212 117 218 132 223 147 226 164 226 180 225 195 222 210 
218 223 212 232 202 217 195 204 201 190 206 175 209 159 209 144 207 130 205 114 199 101 193\n246 0.999 167 102 190 107 213 110 237 113 265 115 296 115 323 112 350 109 377 107 402 101 402 75 379 77 351 80 325 81 303 84 277 85 251 83 225 81 199 78 172 75\n246 0.999 132 138 165 144 199 150 231 152 265 153 300 154 333 153 367 150 405 146 442 142 449 112 419 115 387 120 351 124 318 125 284 126 250 126 217 123 184 120 151 113\n246 0.999 459 176 432 158 399 163 365 165 331 168 298 169 265 168 230 166 196 164 163 159 135 164 151 186 188 190 221 193 260 194 295 194 331 194 365 192 400 189 432 185\n246 0.999 229 234 233 236 248 236 266 236 283 237 302 238 321 239 338 239 356 237 364 228 362 211 349 210 332 211 317 211 307 211 292 210 276 209 260 208 244 208 230 209\n247 0.999 46 51 46 65 48 72 61 75 73 76 84 78 95 80 106 82 117 84 128 85 138 86 140 71 134 66 123 64 112 62 101 60 90 58 79 56 68 54 56 52\n247 0.999 39 83 42 97 50 99 62 98 75 97 87 98 98 99 110 102 121 106 131 110 141 112 142 100 132 94 121 89 109 85 98 83 85 81 74 80 62 80 50 81\n247 0.999 67 113 71 114 77 115 83 115 88 116 94 117 100 118 106 118 112 119 118 120 120 112 117 109 111 108 105 108 99 107 93 106 87 105 81 104 75 104 69 105\n247 0.999 77 116 76 121 76 128 79 130 84 130 89 131 94 131 99 132 104 133 108 133 112 133 112 127 112 120 109 119 104 118 100 118 95 118 90 117 85 116 80 116\n247 0.999 77 145 77 150 77 156 78 160 84 160 88 160 93 161 98 161 104 161 108 161 112 161 112 155 112 148 110 145 105 145 101 145 96 145 91 145 86 144 80 144\n247 0.999 71 174 74 175 79 175 85 176 90 176 95 176 101 176 106 176 112 176 116 175 118 169 114 166 110 166 104 166 99 165 93 165 88 165 82 165 77 165 71 166\n247 0.999 80 178 80 181 80 186 84 187 87 187 90 187 94 187 97 187 101 188 104 188 107 187 107 183 107 178 104 178 101 178 97 178 94 178 90 178 86 177 82 178\n247 0.999 70 191 70 198 74 199 80 199 85 199 90 200 95 200 101 200 107 200 112 200 115 198 115 191 112 190 106 189 101 189 95 189 90 189 84 189 79 189 73 
189\n248 0.999 179 72 189 87 204 84 220 77 237 74 254 74 270 76 287 80 304 87 318 96 332 87 328 77 314 68 296 59 279 54 262 52 244 52 227 53 209 58 194 65\n248 0.999 114 157 114 162 113 172 117 176 122 175 129 176 134 177 141 177 147 178 154 178 160 177 161 169 161 159 158 157 152 157 146 157 140 157 133 157 126 157 119 156\n248 0.999 346 172 345 177 346 185 350 188 356 188 362 188 368 189 374 189 380 189 387 189 392 189 393 180 392 172 388 171 381 171 376 171 370 170 363 170 357 170 350 171\n248 0.999 167 279 179 288 196 297 212 304 231 308 251 310 270 309 288 307 304 301 320 295 331 284 319 271 301 279 282 285 265 287 248 287 231 285 212 280 195 274 180 263\n249 0.999 226 35 210 38 196 41 180 45 164 48 148 51 134 55 119 58 103 62 95 72 96 85 108 84 125 81 143 78 159 75 175 72 191 69 208 67 224 64 226 51\n249 0.999 223 91 208 88 195 87 181 89 168 90 155 92 142 94 129 97 115 100 102 106 104 118 113 118 126 114 140 111 154 109 168 108 181 107 196 108 211 109 219 104\n249 0.999 187 112 179 113 171 113 162 113 154 114 146 114 140 115 132 115 122 116 121 122 121 132 126 133 135 132 145 131 153 131 162 131 171 130 180 130 186 128 187 121\n249 0.999 246 151 238 147 229 147 218 147 208 147 199 148 190 148 180 148 170 148 159 148 156 154 161 159 172 159 182 159 193 159 203 159 213 159 223 159 233 159 244 159\n249 0.999 82 151 85 156 93 157 99 157 106 157 114 157 121 157 129 157 136 157 141 157 147 153 143 148 136 148 130 148 123 148 116 148 108 149 102 149 95 148 88 148\n249 0.999 163 166 167 171 176 171 185 171 193 171 201 171 211 171 219 171 228 171 236 170 238 162 231 160 223 160 215 160 207 160 200 160 191 160 182 160 174 160 166 159\n250 0.999 416 41 390 44 367 59 354 85 355 114 371 135 403 144 429 134 440 112 415 125 390 127 371 113 369 95 379 70 400 58 424 65 438 88 447 110 453 83 440 55\n250 0.999 236 44 218 43 197 43 175 43 153 44 132 44 117 44 96 45 75 44 52 44 52 66 69 73 90 73 111 72 130 72 149 72 170 72 192 71 215 71 234 70\n250 0.999 303 76 278 75 251 75 222 
75 194 75 167 75 144 75 118 76 90 76 60 77 54 100 79 104 106 104 134 104 159 104 185 103 211 103 241 102 271 101 296 101\n250 0.999 307 107 283 106 255 106 226 106 197 107 170 107 147 107 120 108 93 108 63 109 56 131 82 135 109 135 137 134 162 134 188 133 215 133 245 133 274 132 301 131\n250 0.999 220 138 203 138 186 139 170 138 152 139 136 139 120 139 104 139 86 139 69 139 58 146 74 150 90 150 107 150 123 149 140 148 156 148 173 147 189 148 207 147\n250 0.999 420 205 388 205 352 205 314 206 278 208 243 208 212 209 177 209 141 209 103 210 93 239 125 244 160 244 195 243 230 242 263 240 298 240 337 239 373 238 410 237\n251 0.999 325 52 295 48 264 47 231 47 197 52 165 60 135 73 104 91 79 114 68 134 87 153 111 174 132 157 157 141 187 128 219 119 252 116 288 117 312 116 320 83\n251 0.999 343 56 336 74 329 89 323 108 324 125 341 133 357 142 374 152 388 162 402 170 412 159 421 149 435 135 441 126 429 112 417 99 402 87 387 78 369 70 355 62\n251 0.999 379 167 350 167 321 167 292 167 263 167 236 167 208 168 180 168 149 167 133 178 134 205 160 206 188 205 217 205 245 205 274 205 302 205 330 205 357 205 378 193\n251 0.999 400 243 367 242 335 242 302 243 270 242 237 243 206 243 173 243 140 243 112 244 109 275 141 276 173 276 206 276 237 276 270 276 303 276 336 276 367 275 399 270\n251 0.999 107 290 111 321 140 321 174 321 207 321 241 321 272 320 305 321 338 321 372 321 402 321 399 290 368 289 336 289 303 289 270 290 237 290 205 289 171 289 138 289\n252 0.999 68 49 67 61 66 74 66 85 76 89 90 89 103 90 120 90 132 90 139 90 148 89 148 78 148 65 148 51 142 48 129 48 115 47 102 48 89 48 78 48\n252 0.999 64 98 64 110 64 123 65 135 78 134 91 133 105 133 121 132 133 132 142 132 152 130 152 121 153 108 149 99 137 99 125 98 112 98 99 98 85 98 74 98\n252 0.999 74 147 74 158 74 170 74 178 81 184 94 184 105 183 120 183 129 183 136 183 144 182 144 173 144 161 144 151 136 149 125 149 115 148 104 148 92 147 82 146\n252 0.999 180 212 168 194 156 198 143 204 126 208 109 210 93 209 77 206 63 200 
48 197 39 209 44 219 58 225 73 231 89 233 106 234 121 233 137 231 153 226 167 220\n253 0.999 279 125 260 110 239 99 214 92 187 89 158 89 134 92 105 97 83 106 62 112 49 128 64 144 86 135 115 128 143 121 164 118 189 121 217 126 237 134 258 144\n253 0.999 89 99 86 95 83 93 80 93 73 94 67 94 63 94 55 96 51 97 50 101 49 105 51 113 57 114 63 113 70 111 71 110 73 110 79 108 80 108 87 107\n253 0.999 208 182 214 183 224 184 233 184 241 184 249 184 257 185 265 184 267 177 269 170 270 164 268 163 259 162 251 162 243 162 232 162 220 162 212 162 210 169 208 176\n253 0.999 57 185 65 186 75 186 86 186 95 187 104 187 114 187 125 187 130 183 131 176 132 166 129 166 119 165 110 165 100 164 89 164 76 164 67 163 61 169 58 175\n254 0.999 419 28 388 28 358 28 322 30 289 34 260 40 228 49 198 57 170 69 141 83 137 106 164 110 194 97 224 86 255 76 289 68 320 64 350 61 382 59 416 58\n254 0.999 189 208 194 237 210 250 242 250 270 250 299 251 329 251 359 251 392 251 419 250 446 250 444 219 423 209 393 209 364 209 335 209 306 209 276 208 245 208 217 207\n254 0.999 206 252 206 278 225 287 252 287 277 287 303 287 329 287 356 287 383 286 407 286 432 286 431 261 412 252 385 252 360 252 333 251 308 251 282 251 255 251 229 251\n255 0.999 31 140 75 150 111 124 153 107 194 99 239 95 287 94 316 104 363 122 400 144 441 147 410 116 368 91 323 74 288 75 244 70 196 71 152 75 110 91 70 112\n255 0.999 61 192 97 196 135 196 174 197 212 197 253 198 292 199 328 200 367 199 406 200 422 170 388 160 349 157 310 156 272 155 236 156 195 156 154 156 116 156 76 156\n255 0.999 163 202 172 217 191 216 209 216 226 217 245 217 263 217 279 218 298 218 316 218 331 215 323 199 304 199 286 198 269 198 251 198 232 197 213 197 195 197 176 197\n256 0.999 63 91 79 124 118 116 155 106 196 100 236 99 277 99 317 103 356 113 394 126 412 116 409 88 371 72 332 64 294 58 252 56 212 56 170 59 132 66 97 77\n256 0.999 185 109 184 124 185 143 200 144 218 144 234 144 253 144 267 144 284 144 305 143 307 143 307 127 307 113 295 108 277 108 262 
108 246 108 229 107 213 107 198 108\n256 0.999 365 290 340 290 317 291 292 291 269 291 244 291 218 292 193 292 168 293 146 293 138 314 162 315 188 314 213 313 237 313 262 312 286 311 310 311 335 311 360 310\n256 0.999 350 312 328 311 310 311 288 311 269 312 248 312 227 314 206 315 185 315 166 315 160 335 180 336 201 336 222 335 242 335 263 334 283 333 304 333 325 333 346 331\n257 0.999 404 91 407 103 414 113 425 111 435 107 447 105 458 103 468 103 477 103 488 104 497 105 499 90 492 84 480 82 470 81 459 81 447 81 435 83 425 85 414 87\n257 0.999 57 138 76 144 115 145 150 145 188 147 231 150 270 152 306 155 345 157 373 157 383 110 371 95 331 92 292 90 258 90 220 88 180 86 137 85 100 85 67 91\n257 0.999 450 118 450 125 450 133 451 138 457 138 463 138 469 138 475 138 480 140 485 140 490 138 491 132 491 123 489 118 484 117 479 116 473 116 467 116 460 116 454 117\n257 0.999 391 163 396 168 409 172 421 175 433 179 446 182 457 184 470 186 483 186 495 185 498 170 493 165 480 166 468 165 456 164 443 161 431 157 420 151 408 147 396 147\n257 0.999 79 181 104 183 135 184 165 186 197 187 227 189 256 191 286 192 317 193 347 195 359 169 336 164 306 162 275 161 245 159 215 158 184 157 152 155 122 155 91 155\n257 0.999 450 186 449 192 450 195 454 197 459 198 463 198 468 198 473 198 477 199 481 199 487 198 487 193 486 186 481 186 477 186 472 186 468 186 463 185 458 185 453 185\n257 0.999 64 225 86 229 117 228 151 231 185 235 219 237 255 236 285 234 317 235 345 240 355 210 333 199 299 195 263 194 236 194 204 192 171 189 138 189 105 186 72 186\n257 0.999 432 204 435 208 444 208 451 209 459 209 467 209 475 209 483 210 490 210 498 208 499 204 497 199 489 199 481 199 474 198 466 198 458 196 450 196 442 194 433 195\n258 0.999 116 78 123 90 131 100 142 95 152 89 164 86 178 90 190 97 199 105 208 100 216 92 221 82 211 73 200 66 188 61 174 58 161 57 148 60 136 65 126 72\n258 0.999 235 107 218 108 202 109 184 109 168 110 151 110 134 110 117 110 100 111 98 114 99 140 116 140 133 140 150 140 167 
139 184 139 201 138 216 138 235 137 235 124\n258 0.999 105 163 113 173 122 183 134 192 147 197 160 200 176 200 190 195 201 188 210 180 218 170 211 159 200 150 188 160 175 167 161 169 147 165 136 157 129 151 117 152\n258 0.999 230 233 229 238 229 242 229 247 229 252 229 256 229 261 231 266 230 268 232 272 238 271 239 266 239 262 239 257 239 253 239 248 239 244 239 239 239 235 236 232\n258 0.999 94 314 91 312 88 310 85 310 81 309 78 309 75 310 72 311 69 313 67 315 66 318 69 321 72 319 75 319 78 318 81 318 83 318 86 319 88 320 93 317\n258 0.999 290 317 270 316 250 316 229 317 212 317 191 317 170 318 147 318 126 318 108 318 103 337 123 338 143 338 164 337 184 338 203 337 225 336 246 337 266 336 288 335\n258 0.999 99 323 95 322 91 322 87 323 83 323 79 323 75 323 71 323 67 324 63 325 63 329 67 331 71 331 75 331 80 330 84 330 87 330 92 330 95 330 99 327\n259 0.999 65 82 102 90 139 90 175 90 217 91 258 91 297 91 340 91 381 92 425 92 435 66 408 55 368 54 329 53 287 53 246 53 205 52 164 51 122 51 77 52\n259 0.999 66 111 67 114 75 114 83 114 91 114 97 115 104 115 111 115 119 116 123 108 125 102 121 99 115 99 108 99 101 98 94 99 87 98 79 98 71 98 68 102\n259 0.999 54 144 72 187 103 182 151 166 197 156 242 152 287 154 333 159 374 171 418 186 441 174 441 138 400 120 358 107 315 99 266 96 220 98 175 104 131 114 88 129\n259 0.999 187 161 188 182 188 201 198 211 217 211 239 211 257 211 277 210 297 210 319 209 323 208 322 185 324 163 313 160 294 159 275 159 255 158 235 158 216 159 194 160\n259 0.999 376 218 350 216 323 216 299 216 274 216 250 216 226 216 200 216 175 217 151 216 136 233 160 236 185 236 211 236 238 236 261 236 287 236 314 236 338 236 363 236\n260 0.999 196 165 203 175 216 174 223 173 235 173 246 174 257 177 268 180 278 185 288 190 300 189 294 180 284 172 274 167 263 163 251 160 239 159 227 157 217 160 206 160\n260 0.999 185 179 190 197 204 200 218 197 232 196 247 196 260 200 275 205 287 213 300 221 313 221 311 205 300 196 287 189 274 183 259 177 245 174 229 173 214 
174 199 176\n260 0.999 225 478 246 491 273 492 300 492 328 492 355 492 382 492 409 492 436 492 464 492 491 490 474 477 446 477 418 477 391 477 363 477 335 477 307 477 280 477 253 477\n261 0.999 104 153 103 204 106 261 150 269 187 272 250 276 292 279 353 264 364 252 402 248 433 223 438 167 406 130 369 124 339 113 296 90 234 80 191 81 159 95 118 116\n261 0.999 154 328 174 341 195 353 218 361 243 368 267 370 290 368 312 365 334 357 357 349 371 330 351 319 328 327 305 334 284 337 261 339 237 335 214 329 193 318 171 309\n262 0.999 208 39 214 45 224 47 234 41 244 39 255 40 266 42 274 47 282 54 289 60 297 51 292 42 284 36 275 30 266 26 255 23 245 21 234 22 223 25 215 30\n262 0.999 233 55 234 64 234 71 235 80 237 88 245 89 256 90 264 90 270 90 276 90 284 90 285 83 285 75 284 67 281 61 273 60 266 59 257 57 249 55 241 55\n262 0.999 98 62 97 69 100 76 107 75 116 76 123 76 132 76 140 76 147 76 153 77 160 77 160 69 157 63 150 63 143 63 136 62 128 62 120 62 113 61 105 61\n262 0.999 87 88 88 98 97 99 107 97 116 97 125 96 136 97 144 97 153 98 161 99 169 98 168 89 158 88 150 86 140 86 132 85 123 85 114 85 103 86 95 87\n262 0.999 301 102 293 94 286 96 277 102 268 106 256 107 247 105 237 102 229 96 221 100 214 108 220 112 229 118 238 122 248 124 258 125 268 123 277 121 287 116 294 109\n262 0.999 39 99 35 95 30 95 25 95 20 96 16 97 11 96 7 98 1 99 0 104 0 109 1 112 7 111 12 110 17 109 21 108 26 108 30 107 36 107 39 103\n262 0.999 40 109 38 107 34 107 30 107 26 108 23 109 20 109 17 110 14 111 11 112 11 115 13 119 17 118 21 118 24 117 27 117 30 116 34 116 38 116 40 114\n262 0.999 214 159 214 168 217 175 227 175 236 175 245 175 256 175 266 175 273 175 282 175 291 174 291 166 287 160 277 160 269 160 260 159 251 159 241 159 232 159 223 159\n262 0.999 303 175 293 174 283 174 271 175 259 175 248 175 238 175 228 175 215 174 203 174 203 187 213 189 224 189 235 190 247 189 258 189 269 190 281 190 293 190 303 186\n263 0.999 47 104 58 114 67 109 79 106 90 105 103 105 115 106 127 108 139 113 149 
118 158 111 154 102 143 96 131 91 119 89 106 87 93 87 81 88 69 91 57 97\n263 0.999 173 87 176 122 176 148 214 148 246 148 280 150 315 150 347 150 379 150 408 150 441 150 440 116 432 89 397 88 367 88 333 88 300 87 263 86 230 86 198 86\n263 0.999 204 153 205 183 207 211 231 214 260 214 289 214 320 214 348 213 375 213 399 212 425 211 425 178 423 154 394 153 368 153 339 152 311 152 280 152 250 151 223 151\n263 0.999 53 180 61 183 73 182 85 183 98 183 109 183 122 183 134 183 146 184 159 184 163 172 157 165 145 164 132 164 120 164 108 164 95 164 82 164 70 165 56 166\n263 0.999 69 184 70 195 77 198 85 198 94 198 102 198 110 198 119 199 128 199 136 199 144 197 145 185 138 182 129 182 120 182 112 182 103 183 94 183 85 183 76 183\n264 0.999 374 81 352 65 325 66 294 66 266 66 237 66 208 67 179 67 152 67 129 68 104 73 123 89 153 89 182 89 210 88 238 88 267 88 296 88 325 89 350 89\n264 0.999 429 94 392 95 354 95 316 96 279 96 244 97 207 96 169 96 133 97 104 108 104 138 142 139 180 139 217 140 255 140 291 139 329 138 368 138 406 138 432 127\n264 0.999 440 148 404 148 368 148 332 148 296 148 261 148 225 149 189 148 153 148 122 149 98 166 133 167 169 167 205 167 241 166 276 166 312 166 347 166 382 166 415 166\n264 0.999 333 187 333 197 342 197 355 197 365 197 375 197 385 197 395 197 405 197 415 195 415 186 414 176 404 175 393 175 382 174 373 174 362 174 353 174 342 174 333 178\n264 0.999 239 178 230 178 223 179 215 181 208 185 203 190 198 196 195 203 193 212 192 220 194 225 205 225 206 219 206 211 208 204 212 199 219 194 226 191 233 190 239 187\n264 0.999 247 180 243 185 241 191 246 194 250 198 254 203 257 209 259 215 258 221 258 226 264 227 270 225 272 219 273 213 271 207 269 201 266 195 262 189 257 185 252 182\n264 0.999 357 197 357 201 357 205 361 206 365 206 368 206 372 206 376 206 379 206 384 206 388 205 388 201 387 197 383 197 379 197 375 197 372 197 368 197 365 197 361 196\n264 0.999 35 206 35 221 45 226 60 226 75 226 90 226 104 226 120 225 135 225 150 225 165 225 165 210 
[Sample detection output omitted. Each line appears to follow the format `<image_id> <confidence> x1 y1 x2 y2 ... x20 y20`: an image index, a confidence score, and the 40 coordinates of the 20 boundary control points predicted for one text instance, e.g.:

```
264 0.999 291 207 291 212 291 217 290 222 290 227 293 230 297 230 302 230 307 230 312 230 316 230 317 225 316 221 315 215 315 210 314 206 309 206 304 206 299 206 294 206
```
]
363 54 356 56 349 59 341 61 334 64 328 66 320 69 313 71 306 71 304 78 311 81 319 79 326 76 334 74 341 71 348 69 356 66 363 65 370 62\n308 0.999 358 66 353 67 348 69 343 71 337 72 332 74 327 75 322 77 316 79 313 81 313 87 319 88 324 86 330 85 335 83 341 81 346 79 352 77 357 76 359 72\n308 0.999 498 65 496 66 492 66 487 68 483 69 479 70 475 71 471 72 469 75 469 80 470 84 475 85 479 83 484 82 488 81 493 80 497 78 499 75 499 73 499 70\n308 0.999 362 75 358 75 352 77 346 79 341 81 335 83 329 85 323 86 317 88 312 88 311 96 316 98 322 96 328 95 334 93 340 91 346 90 352 88 358 86 361 81\n308 0.999 498 81 492 81 487 81 481 83 474 85 468 85 463 86 457 87 450 89 446 91 446 97 452 102 457 102 464 100 470 98 476 97 483 96 489 94 494 92 496 88\n308 0.999 156 73 141 79 120 88 99 99 79 110 61 119 45 128 26 137 23 151 17 173 18 185 34 175 57 162 76 153 96 150 117 150 143 149 148 134 151 112 154 90\n308 0.999 499 91 495 93 489 94 483 96 477 97 472 98 467 100 461 101 455 103 455 108 458 114 464 114 469 112 475 111 481 109 487 108 493 107 499 106 499 104 499 98\n308 0.999 498 106 497 107 492 107 488 108 482 109 478 111 474 111 469 113 465 115 467 120 468 125 473 126 477 125 482 124 487 123 492 122 497 121 499 117 499 115 499 111\n308 0.999 251 115 246 116 241 117 236 119 231 121 226 122 221 123 216 125 211 126 209 130 209 136 214 137 219 135 225 133 230 132 235 130 240 129 245 128 251 126 252 121\n308 0.999 251 128 246 128 241 129 237 130 231 132 226 133 222 134 217 136 212 138 208 139 208 146 213 146 217 145 223 143 227 142 232 141 237 139 242 138 247 137 251 134\n308 0.999 248 138 242 138 238 139 234 140 230 141 225 142 221 143 217 145 212 146 210 148 209 153 214 154 219 153 223 152 227 150 232 150 236 149 241 147 245 146 248 143\n308 0.999 250 146 245 146 240 147 235 148 230 150 225 151 221 152 216 153 210 154 207 155 207 162 212 163 217 162 222 160 227 159 232 158 237 157 242 156 247 155 250 151\n308 0.999 240 162 239 159 237 157 233 158 230 159 226 159 224 160 221 160 217 162 216 
162 213 165 214 168 217 170 220 170 224 169 227 168 231 167 234 167 238 166 239 165\n308 0.999 250 162 245 163 240 165 235 166 230 167 224 168 220 169 214 170 209 171 205 172 205 179 210 181 214 180 220 179 226 177 231 176 237 175 242 174 247 173 250 168\n308 0.999 92 163 82 165 71 167 59 172 47 175 38 179 33 185 30 196 29 206 22 225 33 230 39 229 53 227 68 226 80 224 89 221 92 210 94 197 93 185 93 176\n308 0.999 171 191 167 191 162 192 157 192 151 193 147 194 144 197 144 200 143 205 143 212 144 215 148 216 153 216 157 215 162 215 168 214 171 212 172 207 172 201 172 196\n308 0.999 497 198 445 201 393 207 341 211 289 215 238 219 189 223 138 228 85 232 38 235 4 250 56 247 106 246 157 243 208 241 260 239 313 236 366 233 419 228 462 224\n308 0.999 172 213 169 213 165 214 162 214 157 215 153 215 150 216 147 216 142 217 142 220 142 225 147 227 150 226 155 226 158 225 162 225 166 225 171 225 173 221 173 218\n308 0.999 288 252 282 251 274 251 266 252 258 253 251 254 245 254 237 254 230 257 230 263 230 273 237 274 245 273 252 273 259 273 268 273 276 273 284 273 288 268 288 264\n308 0.999 137 257 137 260 137 263 137 267 138 270 141 270 143 270 147 270 151 271 155 271 158 270 158 267 158 264 158 261 157 257 154 256 151 256 147 256 144 256 140 256\n309 0.999 288 24 282 22 276 22 269 22 261 22 254 23 247 23 240 23 233 23 226 23 223 28 228 31 236 31 242 31 249 31 256 31 263 31 271 31 278 31 284 31\n309 0.999 408 28 411 33 417 33 423 34 428 34 435 34 439 33 445 34 451 34 458 34 464 33 462 28 456 28 449 28 443 27 437 27 430 27 425 27 419 27 414 27\n309 0.999 391 33 397 41 406 42 416 42 425 42 436 43 445 42 456 42 465 43 475 43 483 41 478 32 469 33 459 33 450 33 439 33 429 33 420 33 410 33 402 33\n309 0.999 309 35 299 33 287 33 275 33 263 33 252 33 240 33 228 33 216 33 204 33 196 39 206 42 218 41 229 41 241 41 253 41 265 41 277 41 287 41 299 41\n309 0.999 209 58 212 64 219 65 226 65 231 65 239 65 246 65 253 65 260 65 266 65 273 63 268 57 262 57 256 57 249 57 242 57 234 57 228 57 
221 58 215 58\n309 0.999 274 69 272 65 266 65 261 65 254 65 250 65 243 65 238 65 232 65 225 65 222 67 225 73 230 73 236 73 242 73 248 73 254 73 259 73 265 73 270 73\n309 0.999 129 127 146 148 173 138 197 130 220 126 250 124 277 125 303 128 326 134 352 142 370 128 356 114 332 106 305 100 280 97 254 96 228 98 202 101 178 107 154 116\n309 0.999 422 196 415 155 386 152 347 153 302 155 266 155 229 156 191 157 150 157 109 155 89 162 86 198 124 203 162 201 204 201 245 200 284 199 323 199 361 198 398 198\n309 0.999 391 241 371 217 343 224 316 234 286 241 257 245 226 242 197 237 172 229 143 217 121 239 137 256 164 267 192 275 225 280 254 280 281 279 310 274 338 267 364 256\n309 0.999 499 253 499 254 499 256 498 248 490 248 487 248 482 249 478 249 471 250 466 254 468 258 468 260 473 267 477 266 484 265 489 264 493 263 498 262 499 257 499 256\n309 0.999 498 265 497 267 494 264 491 264 485 265 483 266 480 267 477 267 475 273 477 277 478 279 478 278 483 283 485 282 489 281 493 280 496 280 499 277 498 272 498 270\n310 0.999 127 164 143 195 169 188 196 181 226 177 255 175 283 176 315 181 345 186 371 193 393 188 392 163 362 152 334 146 305 141 274 138 244 137 211 140 182 145 154 153\n311 0.999 443 6 430 6 417 7 403 7 388 7 371 7 357 8 342 8 327 9 314 9 313 21 328 22 340 22 356 22 370 22 384 22 399 22 413 21 428 21 443 21\n311 0.999 111 174 141 168 163 139 190 114 225 99 256 96 285 101 316 118 344 144 364 175 386 167 368 136 344 108 316 89 281 77 246 73 211 79 178 92 150 114 128 143\n311 0.999 128 89 125 89 118 90 110 90 102 90 90 90 83 91 75 91 73 96 72 105 76 110 83 110 86 111 98 112 105 112 111 113 120 113 127 112 128 105 128 96\n311 0.999 310 110 300 107 285 107 274 107 260 107 245 107 232 107 217 107 204 107 194 110 192 120 202 124 214 124 228 124 242 124 254 124 268 124 281 124 294 124 308 122\n311 0.999 158 224 158 247 181 249 206 249 227 250 241 250 260 250 281 251 303 251 323 252 343 250 343 227 323 224 303 224 281 224 260 224 241 224 219 224 199 224 178 224\n312 0.999 268 
193 274 220 302 227 332 240 359 254 383 269 404 286 429 305 451 326 474 351 493 342 487 312 468 290 447 270 422 254 399 237 374 220 347 199 321 194 300 198\n313 0.999 335 66 331 67 328 71 324 75 320 78 318 81 317 86 317 91 317 96 318 99 319 106 323 102 327 99 331 95 335 90 340 87 343 83 343 79 340 75 338 71\n313 0.999 311 85 304 89 297 98 289 106 282 113 275 120 268 129 259 139 260 150 261 157 265 167 270 166 277 157 286 148 296 139 304 131 313 123 314 114 313 105 311 95\n313 0.999 81 113 74 142 94 164 123 176 154 186 183 197 216 210 252 225 286 241 313 263 337 277 341 250 325 223 298 202 265 186 234 175 204 162 172 150 141 138 110 125\n314 0.999 423 121 370 104 307 105 259 109 259 99 216 99 156 97 74 99 58 144 55 186 55 251 111 253 134 210 169 185 187 205 248 211 263 229 347 208 381 198 418 158\n314 0.999 160 268 186 275 218 274 252 274 292 274 328 275 360 275 390 274 415 274 450 273 453 237 426 236 395 236 363 234 316 234 295 234 253 234 229 234 192 234 163 234\n315 0.999 201 104 208 110 216 104 223 99 232 95 242 93 253 94 262 97 271 103 277 110 286 108 282 97 274 90 266 86 256 82 245 82 234 82 224 85 216 90 208 96\n315 0.999 231 103 231 107 231 110 233 114 237 116 240 116 245 116 248 116 252 116 256 115 259 112 259 109 258 106 257 101 254 101 250 101 246 101 242 101 237 101 234 102\n315 0.999 213 138 212 142 212 146 215 148 219 150 222 151 225 153 228 154 232 155 235 156 239 154 240 150 238 147 234 145 231 144 228 143 224 141 221 140 218 137 215 137\n315 0.999 278 141 275 138 273 137 269 138 266 140 262 142 259 143 255 144 251 145 249 147 249 151 250 155 253 156 258 155 261 153 265 152 268 150 271 149 274 147 277 145\n315 0.999 295 166 283 166 271 166 259 166 246 166 234 166 223 166 211 166 197 166 192 172 192 185 204 185 216 186 228 186 240 186 252 186 265 186 277 186 290 186 295 179\n315 0.999 120 186 125 210 152 210 179 210 207 209 234 209 262 210 287 210 313 210 340 210 367 210 360 188 334 188 307 187 280 187 254 187 227 187 199 187 173 187 146 187\n315 
0.999 162 219 164 236 182 236 200 236 218 236 235 236 254 236 271 236 289 236 306 236 324 236 320 219 303 219 285 219 268 219 250 219 233 219 215 219 197 219 180 219\n315 0.999 135 236 144 254 165 254 189 254 212 254 236 254 259 254 283 254 305 254 328 254 352 254 343 236 321 237 297 237 274 237 251 237 228 237 204 236 180 236 158 236\n315 0.999 378 261 350 260 322 260 293 260 265 260 238 260 210 260 183 260 154 260 127 260 112 273 140 277 167 277 195 277 223 277 251 277 279 277 307 277 337 277 364 276\n316 0.999 114 79 127 94 142 100 161 89 179 82 199 79 222 81 241 86 257 93 272 103 282 94 281 76 265 65 247 55 228 48 208 47 186 46 164 50 147 56 131 67\n316 0.999 163 109 162 119 173 120 187 119 202 119 215 119 228 120 241 120 253 120 255 111 254 102 250 89 242 87 229 86 214 85 202 85 190 85 176 85 161 87 162 96\n316 0.999 141 132 141 147 152 153 170 153 186 154 203 154 219 154 236 154 252 154 266 155 278 151 279 135 268 129 250 129 233 129 217 129 200 128 184 128 168 128 153 128\n316 0.999 1 129 1 136 1 144 1 151 1 159 1 166 8 168 13 167 21 168 27 168 33 167 33 159 33 151 34 144 34 137 33 130 27 129 19 129 13 129 7 129\n316 0.999 253 155 241 156 229 156 219 156 208 156 197 156 187 156 175 156 170 161 170 170 170 179 181 180 192 180 203 180 214 181 225 181 235 181 247 181 253 174 253 167\n316 0.999 182 182 182 189 184 195 192 196 199 197 206 197 213 198 221 199 229 200 236 200 244 199 244 192 242 185 234 184 226 184 219 183 211 183 203 182 196 181 189 181\n316 0.999 199 197 199 201 199 205 199 209 201 210 204 211 208 211 211 211 214 211 217 211 220 209 221 206 221 203 220 199 217 198 214 197 211 197 207 197 204 197 202 197\n316 0.999 182 216 184 221 191 222 198 222 205 222 212 222 219 222 226 222 233 222 240 222 243 214 240 211 234 211 227 211 219 211 212 211 206 211 198 210 191 210 185 211\n316 0.999 180 230 183 235 190 236 198 236 205 236 212 236 220 236 227 237 235 238 241 238 244 229 242 223 234 222 227 222 219 223 211 223 204 223 196 222 189 222 182 223\n316 
0.999 161 251 161 255 161 259 163 263 166 262 170 262 174 262 178 262 181 262 184 262 189 260 190 257 190 252 187 251 183 251 179 251 176 251 172 251 168 251 165 250\n317 0.999 373 113 352 94 317 94 290 89 262 79 222 81 185 88 153 96 121 97 102 117 104 152 112 191 140 191 174 191 215 190 254 185 280 178 304 166 332 158 364 153\n317 0.999 397 208 385 176 348 177 311 179 277 189 246 193 210 195 174 193 144 191 113 192 103 225 125 257 152 271 180 282 221 280 260 271 287 262 317 253 347 244 377 236\n318 0.999 13 109 34 105 46 77 65 52 91 38 119 35 149 40 174 55 195 84 205 112 222 102 212 73 196 47 174 29 145 18 115 15 86 21 59 33 37 55 23 82\n318 0.999 77 76 77 86 80 95 90 94 102 94 111 94 121 94 130 94 140 94 150 94 159 93 160 83 155 76 145 76 136 76 126 76 116 76 106 76 96 75 86 75\n318 0.999 67 102 67 114 77 118 88 118 100 118 112 118 124 118 135 118 146 118 158 118 168 117 168 105 159 101 147 100 135 100 123 100 112 100 100 100 88 100 77 100\n318 0.999 207 124 187 123 169 123 150 123 129 123 110 124 91 124 69 124 49 124 30 124 28 141 48 142 68 142 88 142 106 142 127 142 146 142 166 142 186 141 205 141\n318 0.999 82 154 82 162 84 174 91 174 101 174 109 174 118 174 126 174 134 174 143 174 151 174 151 164 151 154 142 154 133 154 125 154 116 154 107 153 98 153 90 153\n318 0.999 143 198 136 197 133 198 125 198 119 199 113 199 108 199 101 198 97 200 96 206 96 211 102 212 108 213 114 213 120 214 126 214 132 213 138 213 142 210 142 204\n319 0.999 306 14 300 6 290 6 280 6 270 6 261 6 253 6 246 6 235 6 226 7 220 12 222 21 233 21 242 22 253 21 262 21 272 21 282 21 292 20 302 20\n319 0.999 168 120 170 138 186 142 205 141 222 141 241 141 266 141 288 141 306 142 321 136 327 120 323 112 304 108 287 104 269 99 252 97 231 99 211 104 194 108 176 111\n319 0.999 135 158 134 185 142 200 173 198 200 197 229 197 263 198 296 197 322 198 348 199 364 200 363 172 352 157 325 155 298 154 269 154 241 154 211 154 185 155 160 156\n319 0.999 172 214 178 234 193 245 209 253 225 261 241 264 265 260 
284 253 299 245 315 236 324 231 324 220 316 211 297 211 278 211 255 211 238 211 221 211 205 211 188 211\n320 0.999 117 82 117 120 138 125 172 127 202 130 232 130 262 130 293 128 321 132 348 137 375 137 381 106 353 93 325 87 294 85 264 85 233 87 204 87 175 85 145 83\n320 0.999 123 131 123 164 147 167 177 168 207 169 236 171 264 171 294 172 321 173 350 173 378 174 377 143 350 138 321 137 294 136 265 135 236 134 209 133 180 131 151 130\n320 0.999 180 190 193 192 208 193 225 194 241 195 257 196 273 197 290 197 306 198 321 197 328 184 317 178 300 177 284 176 268 176 252 175 235 174 218 174 202 173 184 173\n320 0.999 116 195 130 212 159 213 188 214 216 215 244 216 273 218 304 218 331 219 358 222 386 223 376 201 345 199 316 198 287 197 258 196 230 195 201 195 172 194 143 194\n320 0.999 193 222 192 236 207 237 222 237 236 238 250 238 265 239 279 240 293 240 307 240 319 238 317 219 304 219 290 218 276 218 261 218 248 217 233 216 218 216 204 215\n321 0.999 193 46 199 50 204 43 207 38 212 33 218 32 225 35 230 39 234 45 237 52 244 49 242 42 239 36 233 30 228 25 219 22 212 22 205 26 199 31 195 38\n321 0.999 138 107 139 124 138 162 138 178 148 186 173 186 198 182 220 175 241 172 263 171 290 172 295 165 295 137 295 116 276 110 251 108 223 106 198 104 173 103 152 101\n321 0.999 167 227 170 238 183 238 195 239 207 240 219 240 231 240 244 241 256 241 268 241 281 241 278 228 266 228 254 227 241 227 228 227 216 226 203 226 191 225 179 225\n321 0.999 139 243 139 264 152 269 172 270 190 270 210 271 228 271 247 271 266 272 284 273 304 273 304 247 293 245 274 244 254 243 234 242 215 241 195 240 176 240 156 239\n321 0.999 277 300 279 302 283 302 287 302 289 302 293 302 296 302 300 302 303 302 307 302 308 295 307 291 304 291 299 291 296 291 292 291 288 291 283 291 280 291 278 294\n321 0.999 265 304 265 309 269 310 274 310 279 310 283 310 289 310 293 310 299 310 303 310 309 307 308 302 302 302 298 302 293 302 288 302 283 302 278 301 274 301 269 301\n322 0.999 192 34 177 34 162 33 147 31 129 
32 117 33 102 35 83 40 66 44 64 58 66 76 78 76 94 73 112 70 130 68 139 69 155 70 180 73 190 68 191 52\n322 0.999 198 88 181 87 168 87 150 86 129 87 118 97 106 97 83 96 66 97 60 108 58 135 68 137 83 136 105 132 126 130 134 127 151 125 177 122 193 121 197 107\n322 0.999 193 127 179 128 167 127 153 129 138 131 126 133 115 134 99 134 84 135 75 139 72 156 86 157 99 155 114 155 130 153 139 152 154 151 171 150 187 150 193 143\n322 0.999 199 161 182 161 166 161 150 162 131 163 117 163 102 164 82 165 66 165 52 166 51 188 67 188 82 187 100 187 118 186 130 187 148 187 166 187 184 186 196 178\n322 0.999 66 191 66 194 69 195 72 195 76 195 80 195 85 195 88 195 93 195 94 194 96 190 95 187 91 187 87 187 84 187 79 187 75 187 72 187 70 187 67 187\n323 0.999 180 25 174 12 164 7 152 7 142 7 130 8 116 8 104 8 92 8 80 9 77 17 77 31 89 33 103 35 114 36 126 36 139 35 150 34 161 32 176 30\n323 0.999 3 38 18 61 43 76 73 83 92 86 116 88 148 88 173 86 201 80 224 69 245 50 224 36 202 51 176 60 151 64 125 64 99 63 78 59 50 47 28 32\n323 0.999 507 43 477 49 448 56 432 60 395 70 368 79 340 87 315 95 288 105 271 112 271 153 285 151 313 144 345 136 384 127 418 119 444 111 469 103 498 97 508 73\n323 0.999 471 137 447 142 423 147 409 151 382 156 360 162 339 166 318 171 296 176 276 182 277 208 293 206 315 201 340 197 368 193 394 188 415 184 437 179 461 175 471 161\n324 0.999 78 48 75 103 78 142 114 154 155 142 198 133 250 128 302 131 349 141 393 161 433 166 438 121 440 66 405 50 362 48 316 46 265 45 215 45 169 46 126 47\n324 0.999 243 135 236 135 228 136 221 137 214 138 207 140 201 141 194 142 189 148 188 153 189 160 196 162 203 158 209 156 216 154 222 152 229 151 236 150 243 150 244 142\n324 0.999 271 135 271 143 273 150 280 150 287 151 294 154 301 157 308 159 315 162 322 164 328 164 329 155 329 147 321 145 314 144 307 141 299 139 292 137 285 136 278 135\n324 0.999 70 168 67 209 113 211 155 212 194 211 233 212 274 212 317 212 358 212 400 213 444 213 440 172 400 168 361 167 320 166 276 167 231 168 188 
168 151 168 112 167\n324 0.999 89 226 97 258 138 259 177 258 216 258 255 258 293 259 332 259 370 259 410 259 448 260 441 226 399 226 361 226 323 226 283 226 244 226 204 226 166 226 128 226\n324 0.999 88 260 110 277 148 277 186 277 223 277 260 277 298 277 335 277 375 277 411 277 448 277 424 259 388 259 350 259 313 260 275 260 237 260 200 260 162 260 126 260\n324 0.999 88 278 101 304 140 304 179 304 217 304 256 304 294 304 333 304 370 304 409 303 447 303 431 277 393 277 356 277 317 277 280 277 240 277 201 277 163 277 126 277\n324 0.999 87 305 103 326 143 326 180 326 219 327 256 326 294 327 332 326 370 326 408 326 446 326 427 304 390 304 352 304 313 304 276 305 238 305 200 305 162 304 125 304\n324 0.999 67 338 94 351 134 351 173 351 211 351 250 351 289 352 328 352 367 352 405 352 444 350 414 341 376 340 337 340 298 339 259 340 219 339 182 339 144 338 105 338\n325 0.999 145 57 133 52 124 47 106 43 89 44 75 48 64 56 52 65 48 78 49 86 50 98 62 98 76 98 92 98 104 98 118 98 132 98 144 93 143 80 144 69\n325 0.999 156 110 140 110 126 110 104 110 81 110 67 110 48 110 41 124 41 143 41 160 40 173 57 174 76 174 96 174 111 174 131 174 148 174 156 160 156 141 157 128\n325 0.999 30 177 30 193 45 194 59 194 74 194 89 194 104 194 121 195 136 195 150 195 164 193 163 177 151 176 136 176 120 175 104 175 88 175 72 175 58 176 44 176\n326 0.999 408 125 371 110 346 103 309 98 265 95 225 97 187 103 147 112 110 127 78 144 87 177 115 177 147 165 184 155 221 148 260 145 295 147 331 154 370 165 396 161\n326 0.999 406 183 413 185 423 185 423 184 413 184 415 195 423 195 432 195 442 194 450 193 458 193 468 192 469 182 462 181 453 182 444 182 438 184 431 176 421 175 413 175\n326 0.999 302 180 284 180 275 181 259 181 240 181 223 182 207 182 191 182 173 182 166 191 168 209 184 210 198 209 215 209 231 208 246 208 263 207 278 207 294 207 302 196\n326 0.999 413 196 414 203 417 208 424 208 430 208 434 207 441 206 447 206 454 206 459 206 466 206 467 198 463 195 457 195 450 195 444 195 438 195 431 195 425 
195 419 195\n326 0.999 466 205 462 205 459 205 456 205 453 205 450 205 447 205 443 205 440 206 438 206 438 209 440 211 443 211 447 211 450 211 453 211 456 211 459 211 461 210 466 209\n327 0.999 23 0 23 11 34 22 46 31 59 45 71 55 81 65 91 76 102 86 115 97 131 90 126 79 117 68 106 58 96 48 85 35 75 25 64 14 52 5 38 0\n327 0.999 373 83 346 68 320 58 292 57 266 63 242 70 214 82 188 98 168 113 144 128 142 150 162 166 184 148 206 130 230 115 254 103 276 94 304 91 331 100 352 100\n327 0.999 319 108 301 112 285 118 266 124 250 128 234 133 216 139 198 145 192 155 192 169 191 185 210 182 228 176 245 172 263 167 280 162 294 157 313 152 320 144 320 129\n327 0.999 418 128 380 138 343 158 305 171 261 179 227 186 189 196 153 212 125 237 105 264 106 288 146 283 186 273 225 264 268 259 305 251 339 240 378 226 398 205 410 166\n328 0.999 385 104 359 90 332 84 301 81 272 81 242 81 212 83 181 86 152 93 123 100 100 126 130 132 161 127 192 123 221 121 247 118 276 119 306 121 336 123 366 125\n328 0.999 431 193 415 131 378 131 338 132 298 133 257 134 215 134 170 135 129 135 90 135 64 156 62 208 101 213 147 211 188 206 222 206 263 207 306 204 347 202 385 200\n329 0.999 218 128 213 100 196 79 175 78 149 78 128 79 108 80 87 82 70 84 47 89 42 91 51 111 63 137 82 142 105 137 127 135 146 133 169 131 196 132 213 130\n329 0.999 232 82 235 117 241 133 267 137 294 142 319 147 345 155 369 163 395 174 420 186 439 175 445 160 426 139 413 132 389 122 365 112 340 103 313 95 284 89 257 83\n329 0.999 96 211 130 219 156 222 195 225 232 228 271 232 314 237 350 240 390 244 429 245 435 204 401 201 369 197 340 194 307 189 268 183 229 178 191 172 152 166 110 160\n329 0.999 14 290 13 302 12 308 16 316 26 316 35 317 45 318 54 320 64 321 75 322 83 313 84 310 85 301 81 297 72 295 64 294 54 292 44 291 33 289 22 289\n330 0.999 192 8 178 7 165 6 152 6 139 7 126 9 114 12 103 17 90 22 83 31 87 39 95 45 106 40 118 34 130 31 143 29 155 28 168 28 181 29 188 20\n330 0.999 195 10 192 32 219 43 240 58 258 78 272 101 282 125 
287 150 285 176 281 202 300 215 306 190 310 165 310 140 306 114 295 89 282 67 264 47 243 30 219 18\n330 0.999 74 31 58 45 42 61 30 78 19 99 12 118 8 138 7 161 9 179 11 200 23 213 37 201 33 181 31 161 33 139 38 117 46 99 58 79 72 64 86 48\n330 0.999 81 121 81 143 81 164 85 179 110 179 132 180 153 180 175 179 197 179 219 179 238 178 239 157 239 139 237 120 213 119 189 120 169 120 147 120 124 120 101 120\n330 0.999 243 190 224 189 207 189 189 189 171 190 154 190 137 190 120 190 102 190 87 190 81 202 97 204 114 204 132 204 149 204 167 204 184 204 202 205 220 204 237 204\n331 0.999 235 35 210 32 187 32 163 31 136 31 109 33 84 35 61 36 36 36 20 50 21 75 39 80 67 78 91 78 115 78 139 78 163 78 188 78 210 74 234 60\n331 0.999 15 119 18 142 45 138 69 134 90 132 113 132 136 134 160 136 187 140 207 145 224 138 223 120 197 113 176 108 154 105 131 104 108 103 84 104 59 107 37 111\n331 0.999 9 141 8 167 32 174 59 176 83 178 108 180 134 182 160 182 188 183 213 183 235 182 236 155 214 147 188 146 163 145 139 144 112 143 85 142 57 141 32 141\n331 0.999 18 178 18 203 38 209 63 211 85 212 109 214 132 216 155 217 182 218 203 219 223 216 228 191 204 188 181 186 157 185 136 184 112 182 87 180 63 179 39 178\n331 0.999 31 212 31 233 44 242 67 244 85 245 105 247 125 248 145 250 167 252 186 253 204 252 204 231 189 223 169 221 150 220 132 219 112 217 91 215 70 214 49 213\n332 0.999 240 25 236 47 250 60 275 68 296 81 312 97 327 118 336 145 339 166 340 185 372 186 376 167 375 142 371 118 360 96 345 76 330 60 308 45 285 35 263 29\n332 0.999 225 30 214 30 202 32 190 35 178 38 168 44 157 50 145 57 140 65 146 76 151 85 158 94 169 91 178 80 188 72 199 67 210 63 222 60 229 52 227 42\n332 0.999 176 86 176 99 184 106 198 106 212 106 225 106 240 106 253 106 267 106 281 106 293 105 294 91 287 85 272 85 258 85 244 85 230 85 216 85 202 84 189 84\n332 0.999 126 80 118 89 111 101 105 115 101 127 98 140 97 154 96 168 96 180 106 184 121 185 131 184 132 174 133 160 135 147 139 131 143 118 150 105 150 94 139 
88\n332 0.999 208 119 208 126 212 130 219 130 226 131 232 131 239 131 246 130 253 130 261 130 264 122 264 118 259 114 253 114 246 114 239 114 232 114 225 114 218 114 211 114\n332 0.999 178 128 178 147 178 166 177 183 184 193 198 193 220 193 239 192 260 193 278 192 291 191 292 175 293 156 292 140 286 131 264 132 244 132 227 133 209 131 194 130\n332 0.999 82 193 81 228 95 246 131 246 169 246 205 246 243 245 279 244 317 245 353 243 388 244 389 211 370 197 335 197 297 197 258 198 223 197 188 196 152 195 116 192\n332 0.999 30 202 29 206 33 210 38 210 42 210 47 210 52 210 56 210 61 210 66 210 71 209 71 204 67 201 63 201 58 201 53 201 48 201 44 201 39 201 34 201\n332 0.999 403 208 404 212 409 212 415 212 419 212 423 212 429 212 433 212 438 213 444 212 445 206 442 203 438 203 433 203 428 203 423 202 419 202 414 202 408 202 404 203\n332 0.999 41 213 41 215 42 217 44 217 45 217 47 217 50 217 52 218 55 218 57 217 59 215 59 214 58 211 55 211 53 211 51 211 48 211 47 211 44 211 42 211\n332 0.999 414 213 414 215 413 218 415 219 417 219 420 220 422 220 425 220 427 220 429 220 432 218 433 216 432 213 430 213 427 213 425 213 423 212 420 212 418 212 416 212\n332 0.999 29 217 28 224 28 229 28 234 33 237 39 237 45 237 52 237 58 237 64 237 70 236 70 230 70 224 70 217 64 217 58 217 52 217 45 217 39 216 34 217\n332 0.999 404 220 404 226 404 231 404 237 407 240 413 240 419 240 426 240 432 240 438 240 444 239 444 233 444 227 444 220 439 219 433 219 427 219 420 219 414 219 410 219\n332 0.999 141 268 157 278 176 288 194 296 214 301 233 303 254 300 273 295 291 287 307 277 320 267 305 254 289 263 270 269 250 273 230 277 209 273 191 265 172 255 158 255\n333 0.999 211 59 212 75 227 77 243 77 257 78 274 79 290 79 305 80 323 80 334 80 347 79 347 63 333 62 318 62 302 61 287 60 272 60 256 59 239 59 225 59\n333 0.999 213 119 213 135 213 152 228 153 245 154 265 155 282 155 300 155 320 155 327 155 341 154 341 138 341 123 329 120 311 120 294 120 277 119 260 118 243 118 228 118\n333 0.999 340 187 329 181 
317 178 304 176 288 175 273 175 258 176 245 178 231 178 214 182 215 191 222 205 237 203 252 200 266 198 281 198 295 198 310 200 325 201 337 199\n333 0.999 326 228 316 226 306 227 295 227 282 226 270 227 257 227 246 227 235 227 227 234 227 247 236 249 247 249 261 249 272 249 284 249 297 249 309 249 322 249 326 241\n334 0.999 414 41 370 40 332 39 288 40 250 40 206 40 160 40 114 39 69 39 28 39 10 79 54 80 101 79 144 79 188 79 231 79 273 79 317 79 363 80 403 80\n334 0.999 95 124 113 134 140 132 170 129 197 127 222 127 248 129 272 132 303 136 328 137 330 118 314 103 288 100 262 98 238 97 212 96 185 97 157 99 131 101 105 104\n334 0.999 47 161 55 195 93 195 130 195 169 195 204 195 240 195 273 195 313 194 342 195 374 193 365 162 330 162 296 161 260 161 224 161 187 160 151 161 115 161 80 161\n335 0.999 335 143 323 127 311 112 289 103 266 99 242 99 220 100 193 107 176 117 162 125 141 136 149 154 174 168 192 157 214 148 237 142 262 140 278 139 299 142 312 151\n335 0.999 183 203 198 206 216 208 233 208 249 208 264 209 280 209 290 209 305 207 312 191 311 174 293 174 286 172 275 172 258 171 242 169 224 168 200 169 185 171 183 188\n335 0.999 124 239 144 249 177 250 206 251 234 251 261 252 290 253 311 253 346 253 370 247 369 217 346 213 334 214 306 213 275 211 247 209 215 207 183 207 147 208 126 215\n336 0.999 422 44 434 60 455 52 476 46 498 43 519 42 542 42 565 45 587 50 605 57 626 57 613 39 594 31 570 25 548 23 526 23 503 23 483 25 461 30 441 36\n336 0.999 839 74 850 82 871 78 892 73 913 73 933 73 954 74 977 74 999 76 1018 76 1023 58 1014 43 994 42 970 42 949 43 927 43 906 43 884 43 864 44 844 48\n336 0.999 1 45 1 59 0 74 1 86 0 103 6 111 21 109 38 109 50 107 61 106 74 106 84 98 85 87 80 73 74 63 66 52 50 50 36 48 24 46 12 45\n336 0.999 133 57 130 72 129 87 139 95 154 96 169 97 184 100 201 103 216 105 228 106 242 104 247 88 246 70 233 68 219 67 204 65 189 63 173 61 159 59 145 56\n336 0.999 651 61 625 60 592 61 561 62 529 62 499 62 474 62 444 62 404 62 401 87 403 119 429 120 460 
120 492 120 521 120 552 119 584 120 615 121 648 120 651 94\n336 0.999 75 101 74 120 88 129 107 130 127 133 146 136 165 139 187 142 207 145 221 148 241 146 242 124 225 119 206 115 187 113 165 110 144 107 128 105 109 103 92 100\n336 0.999 0 119 0 123 0 129 7 133 15 132 24 132 32 132 41 132 50 132 56 132 64 129 64 119 60 111 51 112 42 112 33 112 25 112 17 111 8 111 1 112\n336 0.999 479 131 479 144 489 150 502 149 515 149 528 149 542 149 558 149 571 149 582 149 595 148 593 133 582 127 568 127 555 128 543 128 530 128 516 128 502 128 490 128\n336 0.999 518 164 519 167 519 172 519 175 522 179 527 180 531 180 536 180 539 179 543 179 548 175 548 171 548 166 546 161 542 161 538 161 534 161 530 161 526 161 522 161\n336 0.999 1023 202 918 204 809 207 701 210 595 213 489 216 384 219 277 221 167 223 56 227 9 291 114 290 221 286 330 283 432 281 543 276 650 271 762 269 867 268 962 263\n336 0.999 958 272 934 274 906 275 880 277 851 277 825 278 803 278 777 278 750 281 751 308 755 339 778 338 806 337 833 335 859 335 886 335 914 335 942 335 964 327 961 299\n336 0.999 700 288 661 290 621 291 584 292 546 293 509 294 475 296 440 298 398 298 371 314 372 348 410 348 448 347 486 345 522 344 559 344 597 343 638 342 678 342 698 323\n336 0.999 314 302 291 303 268 304 247 305 223 306 203 307 185 307 165 309 144 319 145 337 146 359 168 359 190 359 213 358 233 357 255 356 277 356 302 356 313 344 314 323\n337 0.999 114 120 106 116 97 114 85 112 71 110 59 111 49 113 40 116 31 120 21 125 17 131 22 141 39 140 47 137 57 134 67 133 78 133 89 133 100 131 110 128\n337 0.999 311 116 307 114 301 115 292 118 284 121 280 126 274 130 270 133 265 137 262 142 263 147 269 151 278 149 283 145 288 142 293 138 301 135 307 134 311 128 312 123\n337 0.999 272 151 259 144 237 131 204 122 160 120 126 124 102 131 75 134 49 136 17 146 19 168 25 207 75 217 102 213 132 206 169 202 200 206 231 210 257 205 259 181\n337 0.999 498 133 490 133 470 131 437 130 408 145 411 172 408 196 395 199 376 208 352 220 352 229 357 247 390 
[Detection-result dump omitted: several hundred machine-generated records from the model's output files (image ids 337–376), flattened with literal `\n` escapes and cut off mid-record. Each record has the form `<image_id> <confidence> <x1> <y1> … <x20> <y20>` — an image index, a detection confidence score, and a 20-vertex boundary polygon. Two representative records, restored to one record per line:]

337 0.999 230 234 217 235 202 235 185 235 165 235 148 235 134 235 120 235 103 235 87 239 87 253 100 256 120 256 136 256 152 256 167 256 184 256 201 257 219 257 230 251
337 0.999 382 252 392 258 410 258 420 258 431 258 445 258 458 258 473 258 489 258 499 256 498 238 489 237 480 237 467 237 454 237 440 237 425 237 411 237 397 237 384 239
114 357 129 353 145 349 158 364 161 379 164 392 166 408 170 423 176 437 181 449 179 454 166 458 151 462 137 445 131 432 128 417 124 403 121 389 119 373 115\n376 0.999 223 191 206 193 188 195 171 197 153 199 136 203 121 206 102 209 85 212 73 220 73 236 90 235 107 233 125 231 143 228 161 225 177 223 192 220 211 216 223 209\n377 0.999 192 30 184 30 176 29 170 28 160 28 148 29 142 29 141 36 140 44 139 52 140 59 148 58 155 57 163 56 172 55 180 55 186 56 191 53 191 45 191 38\n377 0.999 240 69 233 49 216 52 201 55 181 55 163 56 145 58 127 59 108 57 95 56 95 67 95 86 114 91 132 93 152 92 170 91 187 90 207 88 225 87 240 78\n377 0.999 112 111 112 129 119 138 138 138 156 138 171 137 188 137 205 138 222 138 240 138 255 137 255 120 245 114 230 114 212 114 195 114 178 114 161 113 144 112 128 111\n377 0.999 137 139 137 150 137 160 146 164 158 164 168 164 179 165 191 165 202 165 213 165 223 163 223 153 222 142 214 141 203 141 192 141 180 141 168 140 158 139 147 138\n377 0.999 158 210 157 216 158 223 166 223 173 224 179 224 186 224 192 224 199 224 206 224 209 220 208 213 206 208 201 205 194 205 188 205 181 205 174 205 168 204 161 205\n378 0.999 490 80 439 82 385 83 331 84 280 85 223 87 189 88 126 89 66 96 16 110 10 159 63 137 113 133 171 135 209 138 274 138 324 137 376 136 432 135 489 133\n378 0.999 481 154 433 153 382 153 328 154 280 154 226 155 197 155 137 157 83 158 31 159 28 211 74 212 129 211 184 210 223 210 277 209 325 209 375 209 428 207 481 203\n378 0.999 0 256 43 262 96 263 146 263 198 264 249 264 305 265 349 268 407 266 456 264 494 242 449 237 397 237 346 236 293 236 244 236 191 236 142 236 90 236 40 236\n378 0.999 149 263 174 279 210 279 244 279 282 279 320 279 355 279 387 279 426 279 462 279 497 278 469 266 433 266 398 268 362 268 330 262 292 262 258 262 220 262 184 262\n378 0.999 0 266 7 275 21 276 36 275 49 276 65 277 80 277 91 277 108 277 122 278 134 274 125 265 110 265 97 265 83 265 70 265 55 264 42 264 28 263 13 263\n379 0.999 194 8 190 7 164 11 141 19 123 26 97 34 
79 40 59 50 60 70 62 88 63 105 77 109 100 99 119 91 137 85 153 78 174 70 194 63 198 42 196 23\n379 0.999 485 158 451 123 406 94 355 78 299 71 235 73 184 85 132 100 82 120 40 143 36 190 66 220 118 194 167 176 216 163 273 159 326 168 373 188 427 209 462 188\n379 0.999 353 186 337 174 306 172 276 174 248 175 201 177 174 179 146 184 138 208 138 232 138 265 163 268 201 268 230 267 260 265 290 264 317 261 347 259 354 227 353 200\n379 0.999 33 330 82 337 120 334 163 333 219 333 263 338 305 344 359 351 410 357 461 349 469 301 443 282 400 278 346 274 294 273 243 273 195 275 138 276 94 276 45 280\n380 0.999 159 89 155 66 150 60 133 62 116 62 97 63 77 63 63 64 49 64 33 64 21 72 22 89 32 100 53 99 66 98 82 97 97 96 112 95 131 94 154 94\n380 0.999 186 101 167 103 149 106 130 109 109 112 88 115 65 118 46 121 27 124 6 127 6 152 23 150 45 146 66 143 86 139 107 137 127 134 148 131 169 128 187 122\n380 0.999 176 175 165 167 152 161 136 158 121 157 105 158 88 159 73 161 59 165 46 170 40 181 50 190 64 183 80 179 94 176 110 176 124 176 139 178 155 181 168 186\n380 0.999 147 181 139 181 131 181 123 182 115 182 105 183 96 184 88 185 80 185 72 185 71 195 78 196 85 195 94 195 103 194 112 193 121 193 129 192 138 191 146 191\n380 0.999 14 213 13 236 31 244 57 244 80 244 100 243 121 243 145 243 167 243 185 242 204 242 205 216 189 214 166 214 146 214 124 214 102 213 80 213 57 213 36 213\n381 0.999 397 0 371 0 343 0 312 0 281 0 246 0 205 5 179 17 148 26 125 41 125 91 147 111 177 98 207 81 235 65 267 50 294 43 324 39 354 42 383 39\n381 0.999 397 125 364 124 315 130 267 138 225 140 200 144 153 150 119 169 116 223 120 255 150 228 182 203 216 198 245 202 282 218 325 243 359 271 389 266 395 225 397 186\n382 0.999 448 149 421 93 400 62 344 41 289 49 241 54 188 52 131 67 99 86 58 121 47 171 88 218 139 206 168 159 210 162 261 150 317 155 336 177 390 214 426 192\n382 0.999 111 226 108 238 108 248 108 262 109 279 115 284 127 284 143 286 158 295 167 301 175 292 177 282 177 274 180 255 182 240 174 235 162 
231 132 226 135 226 121 226\n382 0.999 389 236 368 233 352 232 326 231 303 232 284 233 263 235 239 234 236 253 235 273 234 296 245 298 267 297 294 294 316 292 334 291 358 291 367 290 384 282 385 253\n383 0.999 187 67 187 89 188 103 194 125 207 123 226 121 242 120 258 119 281 118 310 119 315 114 314 98 314 86 314 70 295 67 278 67 259 66 240 66 219 66 202 66\n383 0.999 242 126 245 137 254 137 265 137 275 137 286 138 295 138 306 139 316 139 329 139 337 136 333 126 325 125 314 125 305 125 294 125 283 124 273 124 262 124 251 124\n383 0.999 131 162 131 194 153 196 185 196 215 197 241 196 268 196 295 196 325 197 356 198 379 195 375 165 350 165 326 164 298 163 271 162 241 162 213 161 183 160 155 159\n383 0.999 229 197 233 211 248 211 265 211 281 211 296 211 312 211 328 212 344 212 358 212 371 210 365 198 351 198 336 198 321 197 305 196 290 196 275 196 259 196 243 196\n383 0.999 215 236 224 237 230 237 242 237 254 237 262 236 271 236 283 237 291 238 302 234 302 222 293 222 285 222 276 223 266 223 257 221 246 221 236 221 225 220 215 222\n383 0.999 168 268 167 298 176 304 206 294 227 291 246 289 267 288 289 289 313 291 341 298 355 297 352 270 341 263 321 261 299 261 276 260 252 260 229 260 205 261 185 263\n383 0.999 324 314 321 295 309 291 300 291 286 291 268 290 252 290 237 291 224 291 215 292 204 295 204 309 217 316 231 315 246 315 260 315 275 315 290 315 304 315 319 315\n384 0.999 94 67 111 101 131 126 162 116 194 109 226 108 262 110 296 117 328 126 357 137 376 122 388 94 357 72 328 59 296 52 263 47 228 45 190 44 156 48 128 56\n384 0.999 91 179 98 214 131 214 166 215 199 216 233 217 268 218 304 219 338 220 370 221 399 217 396 181 359 179 326 177 292 175 258 174 223 173 187 172 152 171 118 171\n384 0.999 103 217 104 254 126 272 159 272 190 273 222 273 256 274 291 275 325 275 356 276 388 277 390 237 366 222 335 221 302 219 270 218 235 217 200 216 165 215 133 214\n385 0.999 17 16 17 35 27 41 42 42 56 42 72 43 87 43 102 43 118 44 132 44 145 42 145 20 135 19 121 18 107 18 92 17 
77 16 61 16 45 16 30 15\n385 0.999 37 67 37 78 43 83 52 80 62 79 70 78 80 78 89 79 98 80 107 82 115 82 117 68 110 65 101 63 92 61 83 61 73 60 63 61 54 62 45 64\n385 0.999 17 109 17 125 25 135 39 135 52 135 66 135 80 135 94 135 109 135 121 135 135 134 135 114 129 110 116 109 101 108 87 108 73 108 58 108 44 108 30 108\n386 0.999 53 15 53 21 54 32 54 41 58 45 67 44 76 44 83 44 91 44 97 45 103 43 104 35 104 26 104 17 101 15 91 14 83 14 74 14 65 14 59 13\n386 0.999 38 73 39 82 43 87 52 85 62 84 71 83 80 83 89 84 98 86 106 88 113 88 114 76 109 72 100 69 91 67 82 66 73 65 63 66 53 67 44 69\n386 0.999 19 116 19 131 24 142 35 142 50 142 64 142 79 142 93 142 106 142 119 142 132 141 133 126 131 116 116 116 103 115 87 115 73 115 58 115 44 115 31 115\n387 0.999 74 30 75 36 79 39 85 39 91 39 98 39 104 40 110 40 117 40 123 41 129 39 129 33 126 30 119 28 112 27 105 26 99 25 92 26 85 27 80 28\n387 0.999 58 44 58 54 63 59 73 59 84 58 94 58 104 58 114 59 124 59 135 60 144 61 144 50 140 44 129 43 118 43 107 42 97 42 87 42 77 42 67 43\n387 0.999 158 50 158 55 158 61 158 65 163 67 170 67 175 68 181 68 186 69 192 69 197 69 198 63 197 58 196 54 191 53 185 51 179 50 173 49 168 48 163 48\n387 0.999 79 64 80 70 85 74 88 76 93 78 98 80 104 80 110 79 116 76 122 73 126 69 126 65 124 62 119 62 111 62 105 62 100 62 94 62 88 62 84 62\n387 0.999 103 150 100 155 99 163 100 171 100 179 99 186 99 192 99 200 99 208 100 217 106 221 111 216 111 207 111 199 111 191 111 183 111 176 111 169 111 163 111 155\n387 0.999 139 178 142 180 148 180 154 180 160 180 166 180 172 180 178 179 184 179 190 179 194 174 191 172 185 172 179 172 173 172 167 172 160 172 155 172 148 172 142 172\n387 0.999 193 180 187 179 181 179 175 179 169 179 163 180 157 180 151 180 146 179 140 179 139 185 144 186 150 186 156 186 162 186 168 186 174 186 180 186 186 186 192 185\n387 0.999 198 186 192 185 185 186 179 186 172 186 165 186 160 186 153 186 146 186 140 187 140 193 146 194 153 194 159 194 165 194 172 194 178 194 185 193 191 194 197 
192\n387 0.999 199 235 193 235 187 236 182 237 176 240 172 244 168 248 165 253 162 257 160 262 162 266 169 268 170 265 172 260 176 255 181 251 186 249 192 247 198 246 199 241\n388 0.999 61 50 61 65 73 67 88 67 102 67 115 67 130 67 145 67 159 67 169 67 181 65 181 51 167 50 153 50 141 49 127 49 114 49 100 49 87 49 74 49\n388 0.999 23 64 22 92 22 120 37 129 63 125 94 125 124 124 151 125 177 123 187 123 211 125 217 102 217 75 197 68 173 69 147 70 123 71 97 70 70 70 46 67\n388 0.999 210 126 187 123 162 123 144 124 123 124 97 125 75 125 55 125 31 128 30 159 30 182 51 179 72 176 97 175 125 175 146 175 172 178 195 180 207 173 208 148\n388 0.999 199 199 181 194 163 189 147 188 130 188 112 188 94 189 77 190 61 191 41 200 43 215 55 221 70 218 90 216 109 214 127 214 145 215 165 216 183 220 194 215\n388 0.999 174 286 166 276 156 282 144 286 133 289 121 290 109 290 97 289 86 285 75 281 67 284 73 293 84 298 95 301 106 303 118 304 131 303 142 301 154 297 164 291\n389 0.999 39 60 51 65 66 53 82 45 99 42 113 36 126 42 145 47 162 52 177 61 190 59 181 46 165 36 148 29 131 25 116 26 99 24 81 29 64 37 51 46\n389 0.999 60 77 67 83 78 78 89 73 99 72 109 71 121 71 133 72 144 76 155 80 165 76 157 69 147 64 136 60 125 58 113 58 102 58 89 60 78 65 69 70\n389 0.999 204 92 182 93 170 93 146 93 116 93 93 93 71 95 50 95 34 101 34 126 35 147 56 143 80 142 103 142 124 142 145 142 166 141 188 142 206 141 205 114\n389 0.999 39 145 40 165 44 189 66 184 80 183 95 183 115 183 138 184 158 186 178 188 194 189 195 167 196 148 178 141 159 141 138 142 118 142 95 141 74 141 57 143\n389 0.999 37 200 41 217 58 223 76 222 91 221 106 220 123 221 141 221 160 223 177 225 192 223 195 205 178 198 162 195 144 193 126 192 109 191 89 191 70 192 54 195\n390 0.999 32 48 45 53 63 41 81 33 99 28 117 24 129 26 149 30 168 36 184 46 197 42 186 28 169 18 151 12 132 8 115 7 96 9 75 15 58 24 45 35\n390 0.999 173 58 161 53 151 48 139 45 126 44 113 44 98 45 86 48 75 53 64 58 58 68 69 71 80 64 92 60 104 58 118 58 130 58 141 59 153 
62 165 68\n390 0.999 211 79 187 79 176 79 153 81 130 82 106 83 69 85 46 84 25 90 27 118 28 141 48 138 75 135 98 134 121 133 145 132 170 131 192 131 212 131 212 103\n390 0.999 204 132 183 130 176 131 152 132 131 132 108 133 74 135 53 136 35 143 36 166 37 188 56 184 80 180 101 178 121 177 145 178 168 179 186 180 203 179 203 153\n390 0.999 34 198 39 220 61 223 79 220 99 219 118 218 127 217 148 217 169 218 188 220 202 216 206 197 189 190 169 186 148 186 130 185 110 185 88 186 69 188 51 192\n391 0.999 57 60 60 73 75 73 89 73 103 73 116 73 130 74 144 74 157 74 171 74 183 71 179 57 165 56 152 56 138 56 124 56 111 56 97 56 83 56 69 56\n391 0.999 24 71 24 98 22 131 43 132 67 131 93 131 120 131 147 131 172 131 194 131 220 133 222 106 222 77 197 75 172 76 146 76 123 77 97 76 69 75 48 73\n391 0.999 31 134 32 158 32 183 52 181 75 180 96 180 121 180 146 181 168 183 189 185 208 185 210 158 210 133 189 131 168 131 142 131 121 130 97 130 73 130 52 130\n391 0.999 40 196 45 215 54 226 73 224 91 222 108 221 126 222 143 222 160 224 177 226 189 222 195 204 178 195 162 193 144 191 127 190 110 190 92 190 75 190 57 191\n391 0.999 38 241 49 250 66 248 83 248 100 248 117 248 133 249 150 250 167 251 184 254 199 255 190 243 174 240 157 238 140 236 123 236 106 236 89 236 72 236 55 238\n391 0.999 66 298 73 305 84 310 95 314 106 316 117 317 129 316 140 315 151 312 162 310 170 301 162 296 151 300 141 302 130 303 119 304 108 305 97 303 86 299 75 294\n392 0.999 46 78 45 99 45 126 59 126 80 123 103 122 125 122 146 122 167 121 183 122 201 124 204 105 204 79 187 78 166 78 147 78 125 78 104 78 83 78 61 78\n392 0.999 52 128 53 147 53 168 69 164 88 160 108 159 127 159 147 160 165 162 181 165 195 166 195 145 195 123 178 121 160 121 143 122 124 122 105 122 86 122 68 124\n392 0.999 60 183 64 197 78 196 91 194 105 193 119 193 132 193 146 194 159 195 172 198 183 196 183 182 169 178 155 177 142 176 128 175 114 176 100 176 87 177 73 179\n392 0.999 60 248 69 259 81 268 94 276 109 280 126 281 141 280 156 276 169 
270 180 261 188 248 176 240 165 248 151 255 137 260 122 261 107 259 93 253 80 244 70 235\n392 0.999 79 276 86 282 94 286 104 290 114 292 124 293 134 292 144 291 153 288 161 284 169 276 161 274 152 278 143 280 133 282 123 282 114 281 105 280 96 276 87 272\n393 0.999 186 10 181 9 176 9 171 10 165 10 160 10 154 10 149 11 144 12 142 15 145 21 150 22 154 24 161 24 167 24 172 24 177 23 182 23 186 20 187 16\n393 0.999 127 107 130 129 161 130 189 131 217 132 245 133 273 134 301 135 328 137 356 138 383 138 377 116 349 115 321 114 293 112 265 112 237 110 209 110 181 109 154 108\n393 0.999 412 123 403 125 396 130 388 137 385 147 384 158 388 166 392 175 400 183 409 187 418 188 426 186 436 181 443 171 445 163 447 153 445 142 440 134 432 128 422 124\n393 0.999 59 132 61 137 66 137 70 137 75 137 79 137 84 137 89 138 94 138 99 138 104 135 101 132 96 132 92 132 87 132 82 131 78 131 73 131 68 130 64 130\n393 0.999 63 137 64 142 68 142 72 142 77 142 80 142 85 142 88 144 92 144 96 144 101 140 98 137 94 137 89 137 86 137 82 137 78 137 74 136 71 137 67 136\n393 0.999 168 159 180 160 198 161 215 161 233 161 251 162 268 162 285 163 302 164 323 164 325 152 313 145 295 145 278 145 260 144 242 144 225 143 207 142 190 141 169 141\n393 0.999 206 161 206 169 209 176 219 177 229 177 239 177 248 177 258 179 267 179 277 180 287 179 288 168 282 163 273 162 263 162 253 162 244 162 234 161 225 161 215 160\n393 0.999 61 170 63 175 66 176 70 177 74 177 78 178 82 178 87 178 92 177 96 177 100 174 98 170 94 171 89 171 85 171 80 171 76 171 72 170 68 169 64 168\n393 0.999 126 183 127 206 156 207 185 208 212 209 239 209 266 210 292 210 318 212 345 213 372 213 367 189 339 188 313 187 286 187 259 186 231 185 204 185 177 184 151 184\n394 0.999 175 87 175 105 175 127 198 123 220 117 231 116 253 116 271 116 295 119 304 121 320 122 323 105 322 87 304 82 286 80 266 77 247 76 223 76 203 79 188 84\n394 0.999 411 214 374 217 341 217 313 217 278 216 232 218 197 218 156 219 122 219 86 233 85 273 119 272 158 268 195 265 
232 267 271 267 304 267 338 265 373 264 408 252\n394 0.999 394 288 373 272 345 272 317 273 287 273 254 274 224 274 191 274 161 275 131 275 102 277 123 295 154 295 185 294 215 294 246 293 276 293 303 292 334 292 363 292\n395 0.999 29 37 38 51 50 44 64 37 79 33 95 31 111 32 127 36 141 43 154 51 165 44 161 35 149 25 134 19 118 15 103 13 87 13 72 17 56 21 43 29\n395 0.999 61 45 62 54 68 56 76 56 85 56 94 56 102 56 110 56 120 56 128 57 136 57 135 45 128 45 121 44 112 45 103 45 95 44 86 44 78 44 69 44\n395 0.999 41 59 45 70 56 70 69 69 82 70 94 70 107 70 119 70 132 71 145 71 155 69 152 59 139 58 127 58 115 58 102 58 89 57 77 57 64 57 52 57\n395 0.999 71 88 74 94 77 98 83 97 89 95 95 94 101 94 107 95 113 97 118 100 122 93 124 89 119 85 114 83 107 81 101 80 94 80 88 81 82 83 77 85\n395 0.999 68 164 74 166 80 169 86 171 92 172 99 173 106 172 112 171 118 169 123 167 123 161 120 155 114 155 108 158 101 160 94 160 88 158 82 155 76 154 72 158\n395 0.999 86 189 86 192 86 195 86 198 89 200 93 201 97 201 100 201 104 201 107 201 111 199 111 197 110 192 110 189 107 188 103 188 100 188 97 188 93 188 89 188\n395 0.999 44 199 48 202 53 205 58 208 65 210 70 213 77 215 83 216 89 216 94 216 96 209 91 206 86 204 80 203 73 201 67 199 61 195 55 192 51 189 47 191\n395 0.999 151 202 147 195 143 191 138 194 132 197 126 200 120 202 113 203 108 205 104 207 104 207 106 216 107 217 112 217 119 216 125 214 130 211 136 210 141 207 147 204\n396 0.999 137 123 145 147 158 158 182 149 207 142 235 138 262 139 287 142 313 150 335 159 351 156 353 132 338 115 311 104 287 97 260 94 235 93 207 96 184 103 160 113\n396 0.999 262 170 253 171 248 171 242 171 236 173 230 173 227 176 227 183 227 188 227 192 227 196 233 197 239 198 246 198 253 198 259 198 261 192 260 187 261 182 262 176\n396 0.999 112 236 112 272 138 283 168 291 199 296 233 298 265 297 296 295 328 290 357 283 382 272 382 240 355 244 324 251 293 255 262 258 229 257 197 254 163 248 132 238\n397 0.999 367 61 344 41 318 29 291 20 257 16 222 19 190 28 159 
36 129 49 126 79 128 127 153 123 181 105 210 91 240 85 271 85 301 94 330 108 356 121 369 103\n397 0.999 365 233 343 232 315 232 286 233 257 234 228 235 197 235 167 234 140 235 123 249 123 288 148 293 179 296 208 298 236 297 265 297 294 294 321 291 345 289 366 275\n398 0.999 91 33 82 34 73 37 65 42 58 48 52 54 47 61 43 70 42 78 39 87 41 93 52 95 55 90 58 80 63 70 68 63 75 56 83 51 91 48 93 40\n398 0.999 111 39 109 48 113 54 121 59 128 66 133 73 138 81 142 91 145 101 147 108 157 110 159 101 157 93 154 84 151 75 147 67 141 60 135 53 127 46 120 41\n398 0.999 49 95 48 110 51 121 64 121 78 123 91 125 105 127 118 129 130 131 143 132 156 134 157 121 154 113 140 111 127 108 114 106 101 103 88 101 74 99 61 96\n398 0.999 44 126 44 140 52 146 67 148 81 150 94 152 107 154 121 156 135 158 148 160 162 161 162 148 153 142 139 140 125 138 112 136 98 134 85 132 71 130 57 128\n398 0.999 57 160 57 171 63 177 75 178 86 179 97 180 107 181 118 182 130 183 141 185 151 185 153 174 147 168 135 166 124 165 113 164 102 162 90 161 79 161 68 160\n398 0.999 70 187 70 197 70 205 76 210 86 210 95 211 104 211 114 212 123 213 132 215 141 216 141 207 141 198 136 193 126 192 117 190 107 189 97 189 88 188 79 187\n398 0.999 69 220 65 219 61 219 58 220 54 220 51 221 48 222 44 223 42 225 43 229 43 232 45 233 50 233 54 232 57 231 60 230 64 230 67 230 69 226 69 223\n398 0.999 76 218 76 230 81 236 92 237 103 238 114 239 124 240 135 241 146 241 157 243 168 244 169 233 163 226 152 225 141 225 130 223 119 222 108 221 97 220 87 219\n399 0.999 283 49 259 43 240 38 210 35 177 33 148 33 122 34 93 37 69 52 64 76 61 105 79 105 107 96 134 90 162 85 189 84 216 86 244 91 268 96 280 77\n399 0.999 63 163 62 189 85 199 109 200 133 201 159 202 189 204 215 205 241 206 266 208 293 207 292 181 272 172 247 171 220 169 195 168 168 167 140 165 112 163 88 163\n400 0.999 50 19 50 52 50 94 71 111 100 115 130 120 165 128 193 136 221 151 253 161 286 152 292 124 290 86 270 62 240 56 211 50 181 45 146 38 109 30 80 24\n400 0.999 57 149 
67 174 92 171 119 169 145 169 171 171 197 176 222 181 247 189 272 199 295 204 289 183 265 170 241 161 216 153 191 147 163 143 136 141 110 140 83 143\n401 0.999 189 96 202 117 233 117 265 118 293 118 328 117 357 117 390 117 417 117 443 116 469 113 454 94 426 94 398 94 366 93 336 93 305 93 274 93 245 93 214 93\n401 0.999 48 114 50 118 55 119 61 119 66 119 72 120 80 120 86 120 89 120 94 119 98 113 95 108 89 108 84 108 78 108 72 108 66 107 61 107 56 108 51 108\n401 0.999 202 118 190 116 171 115 153 115 129 114 116 115 93 116 73 117 49 125 51 134 56 167 66 171 88 163 109 157 129 156 149 158 169 161 194 164 202 156 202 140\n401 0.999 108 188 113 193 122 193 132 193 140 193 152 193 161 193 172 193 181 193 184 188 185 178 178 177 171 177 162 177 153 177 143 177 134 177 125 177 116 177 108 177\n401 0.999 462 187 435 187 411 186 385 186 356 186 331 187 303 187 276 187 246 188 221 190 221 218 247 219 275 219 304 220 331 220 358 220 388 220 416 220 443 220 461 211\n401 0.999 189 197 172 196 157 196 140 196 121 196 105 196 86 196 68 196 49 196 32 199 31 217 46 219 65 219 85 219 101 219 120 219 139 219 156 219 175 219 189 214\n402 0.999 294 139 295 153 300 167 314 165 329 160 341 156 355 156 369 157 383 160 394 164 406 163 409 149 401 140 389 134 376 129 361 126 347 123 332 121 316 124 303 130\n402 0.999 338 158 337 163 336 168 336 172 340 175 344 175 349 176 353 176 358 177 363 177 368 176 369 170 369 164 369 159 365 157 361 157 356 156 351 156 346 156 341 157\n402 0.999 309 182 310 194 315 199 326 199 337 199 346 199 357 200 368 199 378 200 388 200 397 200 397 183 393 181 382 181 371 181 361 181 352 181 340 181 330 181 319 181\n402 0.999 295 199 296 215 307 217 321 217 334 217 347 217 360 218 374 218 388 218 401 219 414 219 413 200 402 199 389 200 375 199 363 199 349 199 335 199 321 199 308 198\n403 0.999 231 124 256 135 274 122 292 111 313 105 333 105 353 110 374 117 391 131 409 143 429 135 414 117 396 103 378 92 356 85 333 81 311 81 288 85 268 95 249 107\n403 0.999 263 153 
275 166 293 165 302 153 314 145 329 141 343 143 356 148 369 158 383 164 395 151 385 139 374 127 361 120 345 114 330 112 314 113 298 118 284 127 273 138\n403 0.999 65 158 66 241 103 270 171 270 226 270 280 272 340 273 398 276 460 276 518 280 585 283 584 218 556 180 491 176 435 175 375 172 318 168 251 166 181 163 120 160\n404 0.999 55 102 49 131 49 155 46 185 45 222 67 232 90 232 132 231 167 228 189 225 206 222 219 192 221 169 224 144 221 112 200 97 178 97 146 95 109 95 82 97\n404 0.999 228 248 218 227 202 228 184 235 164 240 145 242 123 244 103 243 84 240 65 237 52 249 53 268 72 272 91 276 108 277 130 278 153 275 174 271 192 265 209 258\n405 0.999 329 107 314 102 301 100 289 100 276 103 264 107 254 114 244 121 236 129 229 138 226 149 235 150 243 141 252 133 261 127 273 122 286 119 298 119 310 121 320 117\n405 0.999 329 156 318 146 307 139 296 133 282 132 269 134 257 140 247 147 240 157 233 167 227 178 234 183 241 175 248 166 258 158 270 153 283 151 296 155 306 162 318 162\n405 0.999 307 166 296 167 282 167 270 169 259 171 247 175 238 180 227 184 225 191 226 200 226 216 234 216 244 213 255 210 267 207 280 204 293 201 304 198 306 190 306 177\n405 0.999 22 192 17 201 16 209 21 213 28 217 35 221 42 226 49 231 55 235 61 240 71 244 80 238 77 233 70 230 64 225 57 218 50 213 43 206 36 201 30 196\n405 0.999 317 214 305 215 294 216 282 217 271 219 260 222 250 225 241 230 231 234 225 241 226 254 233 252 243 247 254 244 266 241 276 238 288 236 300 234 310 233 316 226\n406 0.999 63 76 69 85 77 86 85 82 95 79 105 78 115 78 125 80 135 82 144 88 151 82 150 74 141 70 132 67 122 65 112 64 101 64 91 65 81 68 73 72\n406 0.999 37 101 37 109 37 117 37 126 36 134 36 138 36 146 36 158 37 166 46 166 55 166 56 158 55 150 55 141 55 133 55 128 55 118 55 108 54 101 45 100\n406 0.999 171 118 172 123 172 128 172 132 172 138 172 144 172 148 177 149 182 149 187 149 193 149 194 145 193 141 192 135 191 129 191 123 190 118 187 116 182 116 177 116\n406 0.999 137 186 130 187 124 188 118 188 110 188 100 
188 99 192 99 197 99 204 99 211 99 216 106 216 112 215 120 215 126 215 134 215 138 211 138 206 137 200 137 194\n406 0.999 82 251 78 250 72 250 68 250 63 250 59 250 54 251 50 251 45 251 41 251 40 255 44 257 49 257 54 256 58 256 62 256 67 256 71 256 76 256 80 256\n407 0.999 136 85 149 107 173 96 197 87 220 83 245 81 270 81 294 85 317 92 340 100 360 96 354 81 331 70 306 62 281 58 257 56 231 57 207 60 182 67 159 75\n407 0.999 114 158 114 162 115 171 115 177 120 179 126 178 132 178 138 179 144 179 150 179 156 178 156 171 156 161 155 156 149 156 144 156 138 156 131 156 125 156 117 156\n407 0.999 338 160 338 165 338 175 342 180 347 180 353 179 360 179 367 179 373 179 380 180 386 178 386 171 386 160 383 157 377 157 371 157 364 157 357 157 350 157 342 157\n407 0.999 152 257 170 263 190 270 210 273 231 276 255 277 276 275 296 273 315 268 335 263 341 246 325 240 304 246 285 250 263 251 243 252 222 251 201 248 181 242 161 236\n408 0.999 278 81 285 95 301 92 317 86 332 83 349 82 365 82 382 85 397 89 412 95 423 87 417 75 403 70 388 65 371 62 355 62 338 62 322 65 305 68 290 74\n408 0.999 335 85 335 90 335 95 335 102 340 103 344 103 350 103 355 103 360 103 364 103 369 102 369 97 369 91 369 85 364 85 359 85 354 84 350 84 344 84 340 85\n408 0.999 74 123 82 137 100 131 117 124 134 120 152 118 170 119 189 122 206 127 223 134 236 132 230 117 215 110 197 104 179 100 160 99 141 100 123 103 106 108 90 115\n408 0.999 415 110 397 112 377 113 356 113 335 112 311 111 292 112 290 132 289 153 289 173 288 190 305 189 327 188 354 187 376 186 395 185 416 182 418 165 416 146 415 130\n408 0.999 202 131 192 131 180 131 168 131 157 131 143 131 131 131 130 140 130 152 131 161 131 173 139 174 148 174 163 174 175 174 184 174 197 171 198 160 200 150 201 142\n408 0.999 77 174 77 178 77 181 78 186 81 186 85 186 89 186 92 186 96 186 98 186 103 184 103 181 103 178 102 174 98 173 95 173 90 173 87 173 83 173 80 173\n408 0.999 213 176 213 179 213 182 215 186 218 186 221 186 225 186 228 186 231 186 234 186 238 
184 238 181 238 177 236 174 233 174 229 174 226 174 222 174 218 174 216 175\n408 0.999 110 188 117 190 128 190 139 190 148 190 159 190 170 190 180 190 190 190 203 190 205 179 196 178 186 178 177 178 166 178 155 178 145 178 134 178 125 178 114 178\n408 0.999 425 201 407 201 391 201 375 201 358 201 342 201 326 201 307 201 292 201 281 204 280 220 296 222 313 222 329 222 345 222 363 222 379 222 396 222 413 222 423 215\n408 0.999 88 214 93 224 107 224 125 224 141 225 157 225 173 224 189 224 205 224 222 225 227 210 221 202 205 202 189 202 172 202 156 202 140 201 125 201 109 202 90 201\n408 0.999 90 237 101 239 115 238 130 238 144 238 159 238 173 238 189 238 203 238 218 238 225 226 213 225 198 226 184 226 169 225 153 225 139 225 125 225 110 225 96 225\n408 0.999 280 245 291 255 303 255 322 255 339 255 356 255 372 255 389 255 406 255 422 256 425 237 416 231 399 231 382 231 365 231 348 231 331 230 315 230 298 230 281 231\n408 0.999 280 266 291 269 307 269 322 269 338 269 353 269 368 269 383 269 398 269 414 269 423 258 410 256 395 256 380 256 365 256 350 256 334 256 319 256 304 256 289 256\n409 0.999 314 26 305 6 282 7 258 8 235 9 213 9 190 9 166 8 143 9 121 10 101 16 112 36 135 36 159 36 181 35 205 35 228 34 251 34 274 34 296 33\n409 0.999 306 82 280 81 246 82 213 92 187 109 158 120 125 123 90 123 58 124 25 125 18 159 45 161 78 161 111 160 146 160 181 157 211 142 236 124 268 118 298 118\n409 0.999 195 85 177 85 158 85 138 85 119 86 100 86 80 86 60 87 41 87 21 87 16 106 33 107 53 107 73 106 92 106 112 106 131 106 151 105 170 105 189 105\n409 0.999 288 124 284 124 278 124 271 124 265 124 259 124 252 125 246 125 239 126 238 129 238 140 243 141 250 141 256 141 262 141 268 141 275 141 281 141 287 140 289 135\n409 0.999 289 142 284 141 278 141 271 141 266 142 260 142 253 142 247 142 240 143 239 147 239 156 244 157 250 157 256 157 263 157 268 157 275 157 281 157 287 157 289 152\n409 0.999 311 162 282 161 251 161 220 163 189 163 159 163 127 164 95 165 64 165 32 165 20 189 48 191 79 
191 110 192 142 192 174 192 205 191 236 189 266 188 297 187\n410 0.999 371 75 346 62 317 44 285 33 254 32 217 39 191 48 160 60 126 85 120 106 137 133 155 158 175 138 194 108 220 88 256 79 281 80 303 85 340 105 361 113\n410 0.999 290 98 288 101 288 107 286 113 286 118 288 125 288 128 289 132 296 134 306 133 311 133 315 131 315 127 314 121 313 115 313 109 311 105 311 104 306 98 296 98\n410 0.999 276 99 266 100 255 100 244 101 234 103 221 104 214 105 209 107 209 118 210 134 211 144 219 144 231 143 242 141 253 140 265 139 275 139 278 139 279 127 278 112\n410 0.999 360 150 335 153 311 157 288 160 266 163 240 168 218 171 195 173 168 178 149 192 148 216 169 215 194 210 219 205 241 199 267 196 290 191 305 188 335 184 356 174\n410 0.999 325 189 319 189 313 188 307 189 302 190 295 192 292 196 292 201 293 207 294 214 297 218 301 219 306 219 313 219 319 219 322 217 323 213 325 203 325 200 325 195\n410 0.999 274 199 267 200 260 201 253 201 246 202 239 203 236 208 234 214 234 220 234 228 235 234 241 234 247 233 254 232 260 231 268 230 272 230 273 224 273 218 274 208\n410 0.999 346 216 339 216 332 218 327 220 321 222 314 224 309 224 303 226 298 229 300 237 301 243 306 243 312 242 320 241 325 240 331 238 336 236 340 234 347 232 347 224\n410 0.999 268 234 257 236 246 239 235 241 225 243 214 245 204 248 193 250 182 253 176 261 179 271 189 274 200 271 210 268 221 265 232 262 242 259 249 257 261 255 269 246\n410 0.999 357 237 346 240 335 242 322 245 311 249 297 252 290 254 279 257 277 268 280 283 282 293 292 292 302 289 315 287 327 284 340 281 352 277 355 277 361 268 359 251\n410 0.999 242 267 233 268 224 269 215 272 206 274 195 278 192 280 194 287 196 298 199 311 200 320 208 320 215 318 225 316 234 314 244 312 246 310 246 300 245 292 243 278\n411 0.999 166 79 176 91 189 96 200 87 212 81 226 78 241 79 254 85 265 95 275 105 285 92 288 78 277 65 265 56 250 51 235 47 218 48 203 52 189 59 177 68\n411 0.999 95 111 93 151 119 157 149 158 178 159 208 161 239 163 269 164 299 165 329 166 357 
167 359 125 334 123 305 121 276 120 246 118 215 117 183 115 152 113 122 111\n411 0.999 130 180 129 206 150 210 170 211 190 212 211 213 233 215 254 216 275 217 296 218 316 219 317 190 299 187 280 185 258 184 237 183 215 181 192 180 171 179 149 178\n411 0.999 93 229 108 245 137 247 164 248 191 249 220 251 249 253 276 254 305 256 333 257 350 244 336 225 308 223 281 222 253 220 224 218 196 217 166 215 137 214 108 212\n412 0.999 46 80 54 88 63 86 71 80 81 77 91 76 103 76 113 80 122 86 131 90 138 80 134 74 125 67 116 61 105 58 94 57 83 59 72 62 63 66 54 72\n412 0.999 77 96 78 100 81 105 85 104 88 103 92 102 95 101 99 102 103 104 107 104 109 100 111 97 108 94 104 93 100 91 95 91 92 91 87 92 83 93 80 95\n412 0.999 91 108 89 110 89 114 89 116 88 119 89 122 89 125 89 128 89 129 93 130 96 129 99 130 100 128 100 125 100 122 99 118 99 115 98 111 97 110 94 108\n412 0.999 86 135 85 138 85 141 86 143 88 144 90 144 93 145 96 145 98 145 101 144 103 143 104 141 104 137 103 135 101 134 98 135 96 135 93 135 91 135 88 135\n412 0.999 61 157 66 160 73 163 81 166 88 167 96 167 103 166 111 165 117 162 123 159 122 153 119 146 113 149 105 151 98 152 91 153 84 152 77 150 70 147 65 150\n412 0.999 7 199 6 222 18 230 39 230 58 230 80 230 102 231 124 231 145 231 164 232 185 231 185 205 174 200 150 200 130 200 109 200 88 200 68 200 47 200 25 199\n413 0.999 51 274 66 284 84 297 103 307 123 316 144 322 167 325 188 325 208 321 227 315 232 297 218 283 200 289 180 292 160 291 140 287 120 280 102 271 83 260 66 251\n414 0.999 101 118 131 154 165 156 203 136 245 127 287 129 326 142 360 165 385 197 412 220 441 197 438 161 409 127 378 102 340 83 300 71 256 67 216 73 177 81 139 98\n414 0.999 60 186 85 209 112 229 151 232 187 232 224 233 261 233 297 234 329 235 363 236 377 215 365 185 326 183 290 182 253 181 218 179 183 178 148 174 116 159 85 151\n414 0.999 49 244 75 264 118 265 161 265 202 265 244 266 286 266 329 266 370 266 412 266 453 265 428 242 384 241 343 241 301 241 259 241 215 240 173 239 132 239 88 
238\n414 0.999 115 269 131 288 162 288 191 288 219 288 248 289 277 290 307 290 335 290 365 289 394 288 380 267 350 267 322 267 292 267 263 267 233 268 203 268 174 268 144 267\n415 0.999 204 81 187 69 172 58 153 52 132 47 112 48 92 50 74 56 55 63 39 71 26 84 43 98 57 88 75 79 95 71 115 70 134 71 153 78 171 87 187 94\n415 0.999 157 92 150 90 141 90 131 90 119 90 110 91 101 91 93 91 82 91 74 93 73 103 82 105 91 104 101 104 109 104 119 104 129 104 138 104 147 104 156 101\n415 0.999 159 106 150 105 143 105 130 105 115 105 101 105 90 105 79 105 72 115 72 127 72 138 82 140 96 139 108 139 117 139 130 138 144 138 153 138 160 132 159 119\n415 0.999 33 187 31 213 51 222 71 223 81 224 96 224 121 225 151 224 164 224 183 223 202 221 201 201 194 186 172 185 143 184 124 183 111 183 94 183 73 185 51 186\n415 0.999 34 245 58 254 76 255 92 252 105 252 123 252 143 251 166 251 185 251 204 251 206 235 186 230 171 230 151 230 128 230 107 230 94 229 76 228 56 227 35 227\n415 0.999 177 265 168 265 157 266 141 266 123 266 107 267 93 268 82 267 64 267 62 280 61 293 76 295 91 294 106 294 117 295 133 295 150 295 160 295 175 294 177 281\n416 0.999 196 42 200 54 204 62 215 59 227 56 239 55 250 56 263 60 273 64 282 70 289 69 291 56 285 47 276 41 264 37 252 34 240 33 228 33 216 36 206 39\n416 0.999 124 119 137 124 162 125 185 126 212 128 238 129 263 131 288 133 313 134 338 134 341 114 324 103 303 101 278 100 253 98 228 96 202 94 178 92 152 90 127 91\n416 0.999 211 144 210 157 220 163 233 163 247 164 260 165 273 165 287 166 300 166 312 167 323 163 322 149 312 145 299 144 285 144 271 143 258 143 245 142 231 141 219 140\n416 0.999 190 146 181 148 172 148 164 149 156 150 147 150 139 151 135 155 135 163 135 169 135 176 142 177 151 176 159 176 168 175 176 174 184 174 188 168 188 162 189 155\n416 0.999 204 168 203 183 213 190 229 190 244 190 259 191 274 191 290 191 305 192 318 192 334 192 335 174 324 170 309 169 293 169 278 169 261 169 247 168 231 168 217 168\n416 0.999 155 235 166 242 188 237 211 235 
234 235 256 235 278 236 301 235 322 235 343 234 341 212 326 205 306 206 283 205 260 205 238 204 214 202 192 199 171 200 156 205\n417 0.999 394 37 356 19 319 9 282 5 248 6 213 12 177 25 144 38 114 56 93 74 82 118 105 113 134 96 168 82 202 75 239 73 276 73 316 73 346 73 375 62\n417 0.999 417 69 380 72 335 80 288 87 243 91 197 95 156 101 115 114 80 132 52 153 53 188 89 175 128 162 173 158 220 164 261 170 309 168 350 154 388 141 406 106\n417 0.999 84 215 105 253 123 255 150 256 185 267 213 275 251 276 284 271 331 259 353 237 354 220 355 187 326 185 295 188 267 190 230 190 201 185 163 182 129 181 101 187\n417 0.999 122 274 120 283 131 287 141 289 150 290 159 291 168 293 178 295 190 296 195 288 199 282 197 273 188 271 179 267 170 265 160 264 151 263 141 262 131 262 125 266\n417 0.999 377 261 375 271 374 280 373 290 378 294 387 294 396 295 405 295 414 296 427 296 430 295 432 287 433 278 434 267 428 265 420 264 411 263 400 262 391 261 384 261\n418 0.999 115 10 116 42 145 48 178 49 212 49 242 49 276 50 305 49 339 50 370 49 403 49 402 19 371 11 339 11 308 11 278 10 245 9 210 9 176 9 145 10\n418 0.999 104 59 109 105 107 155 102 200 135 203 177 174 225 158 245 157 293 166 339 178 388 191 415 167 414 123 413 74 376 61 339 65 297 69 245 69 194 64 151 59\n418 0.999 174 180 175 202 179 223 204 223 228 223 249 223 272 223 291 222 316 222 339 222 360 220 360 197 351 181 329 180 308 180 289 180 266 178 243 178 218 178 196 179\n418 0.999 409 223 376 224 333 229 296 232 268 233 228 234 188 235 155 236 118 236 111 271 111 305 146 308 184 306 225 306 256 306 292 305 331 303 371 302 403 299 408 258\n419 0.999 275 59 243 44 212 34 175 31 145 32 109 37 74 47 41 66 49 98 88 89 111 84 143 82 180 87 215 100 245 117 241 78 209 62 198 58 233 71 266 88\n419 0.999 5 114 5 147 33 149 64 150 95 151 125 153 155 154 184 156 216 157 251 158 275 160 271 130 244 128 214 126 185 125 157 123 127 121 94 119 63 117 32 116\n419 0.999 25 185 26 216 45 225 75 225 106 225 136 226 166 225 194 225 226 225 262 225 
283 226 281 193 261 186 229 185 202 184 175 184 145 183 112 183 80 184 53 184\n419 0.999 27 233 28 264 49 272 80 270 110 270 140 269 171 269 198 267 229 267 261 266 287 266 285 237 261 232 232 232 202 231 174 230 146 230 114 230 83 230 54 231\n419 0.999 292 273 281 269 269 269 257 269 245 269 233 269 221 269 209 269 196 269 187 270 184 283 192 287 206 287 217 286 230 286 241 286 253 286 265 286 277 286 290 285\n419 0.999 164 275 148 273 134 273 120 274 106 274 91 274 76 275 61 275 47 276 34 276 32 289 42 293 58 292 74 292 88 291 102 291 117 290 131 290 145 290 160 289\n419 0.999 292 289 262 288 235 288 206 289 178 291 151 292 121 293 92 294 64 295 39 296 33 319 61 319 91 319 119 319 147 318 176 317 203 316 231 315 262 314 291 313\n420 0.999 64 135 95 163 139 146 180 133 224 126 273 122 304 124 346 132 390 144 430 162 462 155 447 126 405 108 365 94 317 84 282 76 233 78 186 87 144 96 103 113\n420 0.999 61 242 77 282 122 283 166 283 210 283 254 284 296 284 342 285 385 285 431 287 474 287 462 247 416 246 373 245 329 244 286 244 240 243 194 243 148 242 104 241\n421 0.999 142 95 155 115 173 141 195 144 222 135 245 132 271 135 298 143 322 156 341 168 356 156 360 123 342 104 320 90 299 78 274 73 249 72 221 72 192 75 167 83\n421 0.999 189 187 185 224 208 238 241 242 274 246 306 249 338 253 372 257 405 261 436 264 470 264 471 223 448 210 416 208 386 204 353 200 321 197 288 193 253 188 219 186\n422 0.999 29 77 48 75 62 61 78 49 95 43 114 42 133 45 151 54 166 67 178 83 194 86 183 68 169 53 152 41 134 33 115 31 95 32 76 37 58 47 43 60\n422 0.999 59 105 67 107 81 108 92 109 103 110 115 111 126 111 138 112 150 112 161 113 164 98 155 95 144 93 131 92 120 91 108 91 96 90 85 90 74 89 61 90\n422 0.999 59 109 58 128 60 143 71 143 82 143 95 144 108 145 121 145 134 146 148 147 158 147 160 130 160 115 147 114 135 113 123 112 109 111 97 111 84 110 69 110\n422 0.999 27 167 39 183 53 196 69 206 86 212 104 215 121 214 139 210 156 202 170 191 184 179 170 165 156 178 140 190 124 195 106 196 89 
194 72 187 58 174 44 161\n423 0.999 239 80 219 72 207 66 185 59 160 56 140 57 125 58 99 62 70 70 68 88 72 108 83 118 104 112 126 108 148 105 168 105 188 111 196 114 225 117 232 99\n423 0.999 181 119 172 120 168 120 159 120 148 120 142 120 136 120 125 120 118 128 117 138 118 144 125 145 137 145 146 144 155 144 163 144 170 144 173 144 182 136 182 127\n423 0.999 82 154 82 171 99 174 114 175 128 175 149 175 167 175 183 175 194 176 210 175 221 171 220 155 207 153 191 152 175 152 160 152 140 152 118 152 111 152 95 152\n423 0.999 233 182 216 179 200 178 182 178 160 179 143 179 127 180 108 180 87 180 68 180 62 191 76 196 95 196 114 196 133 196 151 197 168 197 184 196 209 196 228 196\n424 0.999 52 138 66 144 76 135 84 127 95 124 106 125 116 131 125 139 131 149 137 159 150 154 144 141 138 130 128 121 118 114 106 110 93 109 80 112 69 118 60 126\n424 0.999 56 169 61 177 68 185 76 190 84 195 93 199 103 201 113 201 122 200 131 196 138 191 131 182 121 184 113 188 104 188 95 186 86 181 78 175 72 167 66 164\n424 0.999 83 172 84 176 87 178 90 180 93 181 96 183 100 183 104 184 108 184 111 183 114 183 113 178 110 177 107 178 103 177 99 176 96 176 92 174 89 172 86 171\n424 0.999 14 221 15 238 17 259 28 261 43 265 59 269 74 272 90 277 103 281 118 285 136 284 147 259 133 253 119 248 104 243 89 240 74 236 60 231 47 228 28 222\n425 0.999 160 153 173 172 193 171 211 159 231 153 253 150 271 152 291 160 310 171 325 182 343 164 335 149 318 134 300 124 279 118 257 116 236 116 214 121 195 129 177 141\n425 0.999 120 231 137 244 167 245 195 245 224 245 253 246 281 246 310 246 338 247 366 248 382 231 366 218 337 216 309 216 281 215 253 215 225 214 195 213 167 213 136 213\n425 0.999 136 278 148 303 170 323 193 336 220 346 248 349 275 348 300 340 321 327 341 312 363 286 341 269 319 274 298 293 275 306 250 310 225 308 201 295 181 275 160 265\n426 0.999 273 65 257 45 232 31 206 24 180 22 153 22 126 27 101 36 78 46 56 60 49 78 68 93 91 80 114 68 140 60 165 57 192 58 217 64 241 76 261 86\n426 0.999 242 
120 226 120 206 120 187 120 169 121 149 122 129 123 110 123 91 124 73 125 67 141 83 144 103 144 123 143 142 142 160 142 179 141 197 141 216 140 235 139\n427 0.999 177 52 189 66 202 72 219 62 238 58 255 56 275 58 292 63 308 71 321 81 334 73 329 58 316 48 301 38 283 33 265 29 245 28 227 30 210 35 194 44\n427 0.999 238 62 238 67 241 72 247 70 252 70 257 70 263 70 268 72 272 73 277 74 283 73 283 67 279 63 274 62 268 60 264 60 258 60 253 60 247 60 242 61\n427 0.999 246 92 246 96 246 100 250 101 253 101 256 101 260 101 264 101 268 102 270 102 275 101 275 97 273 93 270 93 267 92 264 92 259 92 256 92 253 91 249 91\n427 0.999 245 100 245 103 246 107 250 109 253 109 257 109 260 109 264 110 268 110 271 111 276 109 276 105 275 101 270 101 267 101 264 101 259 101 255 101 252 101 248 100\n427 0.999 195 115 194 130 198 141 215 142 231 143 247 143 263 144 278 145 293 146 308 146 324 146 325 131 318 121 303 121 288 120 271 119 256 119 241 118 226 117 210 116\n427 0.999 224 166 223 175 224 184 229 190 240 190 250 190 260 190 270 190 279 191 288 191 298 191 298 180 297 170 293 166 282 166 272 166 262 166 252 165 243 165 233 165\n427 0.999 198 199 198 214 203 225 219 224 234 225 249 225 264 225 280 225 294 225 308 225 323 224 322 208 318 199 303 199 288 198 272 198 257 199 242 198 228 199 213 199\n427 0.999 498 204 487 204 474 204 462 205 449 206 438 206 426 207 423 216 425 226 426 236 426 248 435 249 447 248 460 247 472 247 484 247 495 246 498 237 497 226 498 216\n427 0.999 474 272 473 278 474 285 474 292 475 298 475 306 475 312 478 315 485 314 492 314 498 313 499 307 499 300 499 293 499 286 499 279 499 273 493 271 486 271 479 272\n428 0.999 71 90 75 107 95 110 113 112 132 114 153 116 172 117 190 119 210 119 230 119 248 119 241 105 222 104 203 103 185 102 166 100 147 98 128 96 109 94 90 92\n428 0.999 316 107 288 114 261 121 232 125 202 127 172 131 146 136 119 144 89 159 95 187 101 214 116 225 142 211 169 198 197 190 229 183 261 178 290 172 314 163 318 136\n428 0.999 351 145 332 156 
314 167 293 177 269 183 247 187 227 191 202 197 190 212 198 235 203 258 216 264 242 256 264 248 285 239 309 228 330 215 351 202 361 184 355 167\n428 0.999 385 238 372 226 355 231 336 236 318 241 301 246 283 251 265 255 245 261 229 266 218 276 229 290 248 285 266 279 283 275 303 269 321 263 340 258 358 252 375 247\n429 0.999 213 56 213 62 213 67 212 73 215 78 220 79 225 78 231 78 237 79 244 79 250 78 251 70 251 65 251 58 248 55 242 55 237 55 230 55 224 55 219 55\n429 0.999 177 60 162 63 147 70 133 76 119 84 107 92 95 101 84 112 72 123 64 133 75 144 88 143 99 132 111 123 123 114 135 106 149 99 163 92 179 87 181 75\n429 0.999 290 62 287 77 289 88 304 92 320 99 333 104 347 111 361 120 375 127 388 136 400 137 410 125 403 115 390 105 376 97 363 89 349 82 334 75 319 69 304 65\n429 0.999 330 147 308 149 286 149 266 150 245 150 225 151 204 151 184 152 161 152 146 158 147 176 168 178 190 177 211 177 231 176 251 176 272 176 293 176 317 176 329 166\n429 0.999 201 186 200 195 201 204 207 208 216 208 225 208 234 208 242 208 253 208 262 208 271 206 272 197 272 187 264 185 256 185 246 185 237 185 228 185 218 185 209 184\n429 0.999 81 219 81 232 85 241 98 240 112 240 124 240 137 241 150 241 164 241 177 241 190 240 192 226 185 217 172 217 159 217 146 217 133 217 120 217 106 217 93 217\n429 0.999 353 217 336 217 319 218 301 218 285 218 269 217 253 218 236 218 218 218 206 222 210 240 225 241 242 241 260 241 275 241 291 241 308 241 325 241 344 243 352 233\n430 0.999 137 55 138 62 145 65 152 62 159 60 166 58 173 58 181 59 188 62 194 64 200 64 200 55 194 50 188 48 180 45 172 45 164 45 155 46 147 49 143 51\n430 0.999 228 117 219 100 210 84 195 72 177 65 153 67 135 78 121 92 111 108 104 124 98 142 115 148 126 133 135 116 152 101 171 97 187 101 201 116 208 128 219 135\n430 0.999 279 67 277 89 275 116 275 140 298 142 312 141 334 138 373 135 393 133 396 133 420 131 423 114 427 91 430 71 408 69 385 68 360 66 337 65 310 64 296 65\n430 0.999 137 122 138 128 144 131 150 127 157 122 163 120 170 
119 179 121 184 124 189 128 195 127 195 118 190 113 184 108 177 104 170 104 160 105 152 108 146 113 141 117\n430 0.999 159 124 159 126 159 130 160 133 163 134 164 134 167 134 170 134 172 133 173 133 176 131 176 128 175 126 175 123 173 121 170 121 167 121 164 121 160 123 160 123\n430 0.999 250 134 227 134 202 137 182 140 165 142 133 146 110 150 91 152 66 161 66 183 67 206 87 206 111 202 136 199 163 194 185 192 204 189 228 186 240 183 250 156\n430 0.999 316 174 314 185 319 187 335 185 351 183 358 183 367 182 379 180 383 170 384 168 384 160 385 152 376 151 365 151 354 151 343 151 333 150 318 158 316 169 317 166\n430 0.999 228 195 209 192 203 206 193 224 179 239 160 247 141 242 126 229 115 212 103 210 94 222 103 239 114 256 131 268 153 275 171 271 190 261 203 246 213 231 222 213\n430 0.999 426 190 404 192 382 194 363 197 344 201 320 205 299 209 281 212 263 215 243 227 245 249 263 249 283 246 305 241 327 237 348 233 368 229 387 225 402 222 424 211\n430 0.999 195 197 188 197 179 198 173 199 167 200 155 202 147 203 140 205 134 209 134 217 134 225 140 227 148 226 156 224 166 223 174 222 182 220 188 219 192 217 195 207\n431 0.999 250 0 247 3 246 6 244 10 242 14 242 18 246 19 249 20 252 21 257 23 260 23 262 21 264 17 266 12 267 8 267 4 264 2 260 1 256 0 252 0\n431 0.999 216 19 191 13 166 9 142 9 117 12 93 22 70 35 52 52 36 73 22 95 14 114 43 124 56 109 69 88 88 69 110 56 137 47 159 42 187 44 206 41\n431 0.999 239 33 230 43 225 53 229 64 238 73 247 83 256 92 263 102 270 114 278 123 290 119 302 116 302 106 296 94 290 83 283 72 275 61 266 52 257 44 247 35\n431 0.999 127 65 132 71 140 71 146 67 153 63 160 62 168 63 176 67 183 71 191 73 197 66 191 60 185 55 178 50 170 47 162 47 154 47 146 49 139 54 132 59\n431 0.999 299 130 269 130 240 130 210 130 181 130 151 131 122 131 92 131 62 131 37 132 21 150 50 151 81 151 109 151 139 151 168 150 198 150 227 150 256 150 283 149\n431 0.999 249 152 230 152 210 152 191 152 172 152 152 153 133 153 114 153 95 153 76 154 74 170 93 172 112 172 132 
172 150 172 170 172 189 171 209 172 228 172 246 171\n431 0.999 81 186 95 193 109 193 127 193 144 193 161 193 179 193 195 193 213 193 235 192 239 178 223 174 206 174 190 174 173 174 156 174 139 174 122 174 104 174 84 175\n431 0.999 141 196 140 202 141 208 140 214 146 216 152 216 158 216 163 216 170 216 178 216 182 215 182 209 182 203 181 195 175 195 169 195 163 195 157 195 151 195 146 195\n431 0.999 238 216 221 216 203 216 186 216 168 216 151 217 133 217 116 217 99 217 83 217 81 235 97 237 115 236 133 236 150 236 168 236 186 236 204 236 220 236 237 234\n431 0.999 293 238 272 217 250 232 223 253 193 266 162 269 130 267 101 255 75 235 53 218 25 235 44 254 67 276 94 292 125 304 156 307 189 301 217 294 247 282 271 261\n432 0.999 36 97 67 128 103 110 138 106 175 120 214 137 242 157 266 177 285 216 304 248 335 246 317 207 299 173 277 142 252 117 222 95 186 76 148 63 106 56 69 68\n432 0.999 24 157 26 206 44 247 82 264 121 281 159 297 198 311 231 328 274 346 315 363 353 370 356 330 337 303 298 285 263 265 225 247 188 233 150 218 108 197 67 170\n433 0.999 210 24 198 22 182 22 168 24 153 26 136 29 123 31 108 34 95 36 88 45 92 66 105 69 119 65 135 62 150 60 163 59 178 58 192 58 206 58 210 43\n433 0.999 258 78 238 67 212 62 189 62 164 62 138 66 115 72 92 79 70 88 46 98 49 121 68 126 89 116 113 108 136 101 158 98 183 97 206 99 230 105 250 105\n433 0.999 293 162 289 123 257 129 227 145 200 158 168 167 139 171 109 171 80 166 53 157 28 158 31 188 60 197 90 205 123 205 154 204 186 198 215 190 243 186 271 185\n434 0.999 158 66 159 83 159 105 162 118 176 119 194 119 210 119 227 121 243 123 258 124 264 115 268 97 270 78 268 65 253 64 238 63 221 63 203 63 186 63 170 65\n434 0.999 122 130 123 154 125 182 144 182 165 181 189 181 212 183 237 186 262 192 281 198 291 186 295 161 299 142 279 133 257 128 236 125 211 123 187 122 164 123 142 125\n434 0.999 57 215 68 254 93 260 126 250 164 246 199 247 236 256 274 271 307 289 337 310 353 287 368 258 343 233 305 212 271 196 236 188 197 183 158 182 
122 186 89 199\n435 0.999 239 38 219 38 201 37 190 37 168 37 146 38 131 38 114 38 96 38 84 48 84 67 102 67 120 67 140 67 157 67 176 68 194 68 212 67 230 68 238 56\n435 0.999 226 76 209 75 193 74 185 75 166 74 147 74 134 74 120 74 104 75 97 88 97 104 112 104 127 104 144 104 159 104 175 104 190 105 206 104 221 104 225 91\n435 0.999 100 156 101 172 101 187 117 192 123 188 136 183 166 184 185 186 198 195 202 197 211 197 211 180 211 163 208 146 190 141 175 136 158 133 135 137 124 143 114 149\n435 0.999 223 212 205 211 189 211 182 211 161 211 142 211 130 211 115 211 97 212 92 227 92 241 109 242 125 242 142 242 158 242 173 243 189 243 206 243 221 242 222 227\n435 0.999 82 256 81 272 98 276 126 276 137 277 155 277 180 277 202 276 216 276 229 276 241 266 240 247 223 247 205 247 186 247 168 247 148 247 125 246 107 247 92 247\n436 0.999 156 116 157 141 156 177 178 174 202 168 226 164 248 161 269 162 293 166 316 172 336 175 346 161 334 124 321 116 297 116 276 116 250 115 227 114 202 113 178 112\n437 0.999 195 32 188 28 179 29 171 29 161 30 152 30 141 31 130 31 121 33 110 35 109 48 115 49 125 47 135 47 146 46 155 45 165 45 173 44 183 44 194 44\n437 0.999 44 72 53 92 80 84 110 78 136 74 163 72 190 73 215 74 244 77 265 84 280 78 276 59 250 51 223 48 195 47 167 47 140 48 112 51 86 55 65 63\n437 0.999 257 85 238 81 222 82 203 83 184 84 164 85 143 85 124 86 106 86 87 87 73 99 93 102 112 101 132 100 151 100 171 100 190 99 208 99 229 99 247 99\n437 0.999 265 120 260 100 238 104 212 109 188 111 164 112 139 112 118 111 95 109 71 105 60 116 62 136 87 140 110 142 134 143 159 143 185 141 210 140 232 137 253 132\n437 0.999 222 144 211 143 196 143 184 144 170 144 156 144 140 144 126 144 112 144 97 144 95 157 104 163 118 162 133 162 149 162 161 162 175 162 188 161 204 160 219 159\n437 0.999 275 175 251 175 227 175 203 175 179 176 155 177 130 177 105 177 81 177 57 177 38 187 62 188 86 188 111 187 136 187 160 186 185 186 206 186 232 185 255 185\n437 0.999 286 214 261 215 236 215 210 216 184 216 
156 217 128 217 97 217 71 217 38 218 37 250 63 250 90 249 120 249 149 248 175 249 204 248 228 248 257 247 286 242\n438 0.999 6 112 33 119 38 96 46 72 66 53 93 46 120 51 145 68 151 94 161 115 183 112 185 81 174 55 151 35 123 24 96 20 67 24 41 37 22 58 13 82\n438 0.999 188 21 188 38 187 59 200 61 217 60 233 60 248 61 265 61 282 61 301 61 311 61 310 43 309 27 300 21 284 21 268 21 250 21 233 20 215 20 202 21\n438 0.999 202 66 202 79 201 102 202 106 215 106 229 107 243 107 258 107 271 108 288 109 295 108 295 91 296 78 294 66 282 66 268 66 254 65 239 65 224 64 214 64\n438 0.999 201 131 208 133 220 133 231 134 242 134 253 134 265 135 276 135 287 135 298 133 300 121 296 117 282 117 270 116 259 116 248 116 236 116 225 115 213 115 202 120\n438 0.999 156 125 147 118 132 118 120 118 106 118 92 118 78 118 64 119 50 119 36 119 33 129 42 136 55 136 68 136 81 136 97 137 112 137 126 136 140 136 155 136\n438 0.999 31 140 31 156 46 157 61 157 75 156 90 156 104 156 118 156 131 156 147 155 158 151 157 138 141 137 128 137 113 137 98 137 84 137 70 137 55 137 39 137\n438 0.999 182 155 197 156 210 156 224 157 240 157 255 157 271 157 287 157 302 158 315 157 317 140 305 138 291 138 276 137 258 137 243 137 228 137 213 136 199 136 183 138\n438 0.999 44 187 42 214 57 231 84 231 113 231 141 232 169 232 198 232 226 232 255 233 279 233 280 206 265 189 236 189 209 188 181 187 151 187 123 187 93 187 69 186\n439 0.999 151 54 168 71 189 62 208 52 230 46 252 44 274 47 296 54 317 66 335 78 353 68 342 52 323 39 303 29 282 21 258 18 234 18 211 21 186 31 170 41\n439 0.999 130 77 132 109 140 141 158 143 184 144 217 145 246 146 281 146 312 147 338 149 364 146 365 119 363 94 341 81 313 79 283 77 250 74 218 74 182 74 161 74\n439 0.999 117 177 127 191 168 191 188 192 218 193 250 195 280 196 310 196 343 196 369 195 374 167 361 151 333 149 298 146 269 146 242 146 212 146 181 144 143 141 127 148\n439 0.999 295 209 269 207 262 209 227 204 202 205 177 207 154 209 130 213 108 218 101 233 105 255 120 264 142 260 167 
257 190 256 215 254 239 253 256 244 258 248 289 239\n439 0.999 296 208 291 225 289 242 287 255 298 258 314 260 328 262 343 264 358 267 369 270 383 271 384 258 387 244 390 228 380 222 365 219 349 215 334 212 315 210 308 209\n440 0.999 100 172 130 183 159 146 185 125 222 119 272 117 314 125 356 149 380 174 405 196 411 154 392 123 372 84 326 78 282 71 238 68 190 75 185 97 151 106 113 127\n440 0.999 206 148 206 162 206 184 210 190 224 191 239 192 254 193 270 194 285 194 299 194 313 190 312 178 313 157 307 149 294 148 279 148 263 148 248 147 233 146 218 146\n440 0.999 177 211 179 233 183 252 199 253 218 253 238 254 257 254 278 255 298 255 316 255 334 252 333 230 332 211 310 210 293 210 275 210 254 209 234 209 215 208 194 208\n441 0.999 42 32 85 43 132 47 191 50 225 53 277 55 326 56 381 59 430 60 478 61 498 39 451 25 407 24 356 21 307 19 258 16 204 14 167 24 118 19 70 15\n441 0.999 0 104 7 157 57 145 119 135 168 128 225 126 281 126 340 130 393 135 442 147 483 152 481 107 428 89 373 79 320 74 264 71 204 71 155 73 98 79 45 88\n441 0.999 375 145 354 140 328 140 304 140 274 139 247 140 223 140 198 140 171 140 143 140 137 158 157 169 184 168 211 168 236 168 262 168 287 169 314 168 343 168 368 167\n441 0.999 161 171 162 192 180 197 203 196 224 196 246 196 268 196 292 196 314 195 334 195 352 194 352 171 331 170 309 170 289 170 267 169 242 169 223 169 202 169 180 169\n441 0.999 337 227 337 237 341 243 353 243 361 243 371 244 382 244 395 246 408 246 412 243 413 236 413 227 408 219 397 219 388 218 378 218 365 217 358 217 348 218 339 217\n441 0.999 239 224 238 234 240 245 251 246 261 246 271 246 283 247 296 247 309 248 317 246 319 241 320 231 317 220 307 219 298 219 287 220 274 219 265 219 256 219 245 220\n441 0.999 218 222 206 220 196 220 189 220 174 220 163 220 153 220 143 220 134 225 134 234 135 247 144 249 155 249 167 249 177 249 188 250 198 250 210 251 216 243 216 233\n441 0.999 4 224 4 235 4 246 11 253 22 253 33 253 45 253 58 253 71 253 81 253 86 252 87 240 88 227 79 224 69 
224 58 224 44 223 36 224 25 224 14 223\n441 0.999 486 250 463 246 438 246 418 246 391 245 367 246 344 247 322 248 298 248 275 247 266 267 286 274 310 273 335 273 356 272 381 271 404 270 430 270 456 268 478 266\n441 0.999 259 256 248 250 235 249 228 249 211 249 199 250 189 250 179 250 167 250 158 255 158 266 160 275 176 277 189 276 200 275 212 275 222 276 236 275 248 274 254 267\n441 0.999 24 253 23 268 28 280 43 280 57 280 72 281 87 281 105 282 123 282 133 282 144 279 148 265 140 256 125 254 111 253 95 253 77 252 66 252 51 253 36 253\n441 0.999 376 286 350 283 327 281 309 281 281 282 259 283 237 283 215 284 192 283 178 291 177 315 194 324 216 322 240 321 262 320 284 320 304 320 328 319 351 318 365 307\n442 0.999 480 57 446 51 409 48 373 48 336 51 297 57 260 66 226 77 193 91 164 107 165 134 192 135 223 121 258 110 292 101 331 93 367 92 403 93 439 97 471 92\n442 0.999 37 50 35 61 35 73 42 83 50 92 56 102 64 111 70 121 77 130 85 141 92 148 95 137 93 124 86 116 79 106 72 96 66 86 60 77 53 67 46 58\n442 0.999 29 82 24 102 20 126 21 150 38 169 55 190 73 208 92 223 110 236 130 255 148 271 151 246 152 224 144 202 128 183 110 167 92 152 76 137 59 120 43 98\n442 0.999 457 105 430 111 402 118 370 119 339 122 306 127 273 132 243 136 214 141 196 154 198 184 220 187 250 183 281 177 310 173 343 169 374 164 404 156 431 143 459 130\n442 0.999 470 168 438 172 402 178 368 183 333 188 296 194 259 199 226 204 190 209 177 228 177 260 209 255 244 250 279 246 314 240 348 236 383 232 421 226 456 221 472 203\n442 0.999 96 229 95 239 94 249 100 258 106 265 113 274 118 282 125 289 132 297 139 304 145 310 148 300 148 290 143 282 136 275 130 267 123 259 117 251 110 243 104 234\n443 0.999 221 67 224 81 234 80 246 75 259 73 271 73 284 73 295 75 307 80 317 86 326 88 327 74 316 67 305 61 293 58 281 56 268 56 257 56 243 59 231 62\n443 0.999 150 122 165 139 192 141 221 142 251 143 280 143 309 144 338 145 366 146 393 147 420 147 408 127 377 126 348 125 319 124 291 123 263 123 234 122 205 122 177 
120\n443 0.999 149 152 160 168 185 170 210 170 235 171 260 171 286 171 309 172 333 173 356 174 379 172 369 154 340 154 317 153 293 153 270 152 246 151 221 150 195 150 171 149\n443 0.999 150 180 160 199 189 199 218 201 245 201 274 201 302 202 331 202 357 203 385 204 412 204 401 183 373 182 341 182 314 182 287 181 260 181 232 180 203 179 176 178\n443 0.999 149 210 160 229 186 230 213 231 240 231 268 232 296 232 322 232 347 233 374 233 398 231 389 211 359 210 334 210 307 209 280 209 254 209 228 209 201 209 176 209\n443 0.999 36 223 36 227 37 230 41 232 45 232 49 232 54 232 58 233 62 233 66 233 70 229 71 224 68 220 64 220 60 220 55 220 51 220 47 220 43 220 39 220\n443 0.999 37 240 37 244 37 250 37 255 37 260 43 261 48 261 52 261 57 261 61 261 66 260 66 255 67 248 67 243 66 238 61 238 56 238 51 238 46 238 40 238\n443 0.999 70 253 70 257 70 261 73 264 77 264 81 264 85 264 90 264 94 265 98 265 103 264 103 260 103 254 100 253 95 253 91 253 87 253 83 252 78 251 74 251\n443 0.999 69 267 67 270 68 278 68 285 68 289 71 293 78 293 83 293 88 293 94 295 99 294 100 287 100 279 101 274 102 267 97 266 92 266 85 266 80 266 75 265\n443 0.999 39 267 37 269 37 276 37 281 37 286 37 290 43 291 48 291 52 291 57 292 62 291 63 288 64 282 64 276 64 270 63 267 58 267 54 267 49 266 44 267\n444 0.999 75 150 111 175 145 168 177 146 215 132 255 129 286 134 321 149 353 176 392 181 417 154 410 126 375 103 333 85 301 74 255 67 208 70 166 82 129 98 106 121\n444 0.999 425 185 380 184 358 183 317 178 279 177 230 180 183 182 143 184 103 185 73 197 73 238 115 236 156 235 195 234 234 234 270 234 305 235 339 235 381 235 424 234\n445 0.999 53 147 94 174 138 165 176 138 219 124 267 122 301 127 340 144 373 173 420 178 452 155 440 117 401 93 360 72 320 57 272 48 205 52 161 66 117 85 91 110\n445 0.999 459 186 407 186 381 185 334 181 291 180 241 182 184 186 136 187 86 188 51 200 53 252 104 248 150 247 195 246 241 246 282 246 318 245 357 246 408 246 458 243\n446 0.999 47 52 74 78 90 110 111 127 172 118 225 113 269 
115 316 117 347 121 402 132 435 124 453 84 409 65 362 56 312 51 267 50 220 50 163 50 113 53 83 58\n446 0.999 449 221 412 205 365 195 335 198 306 208 270 210 231 202 188 197 141 199 105 207 97 236 128 247 171 240 209 242 246 252 289 254 321 247 343 236 383 240 429 259\n447 0.999 114 94 118 113 131 112 147 111 163 111 179 111 195 113 212 116 227 117 242 122 256 121 258 106 243 101 228 96 211 93 194 91 178 90 161 89 145 90 129 91\n447 0.999 138 121 145 124 155 125 165 125 176 126 186 127 196 127 207 128 217 130 227 130 235 123 226 118 216 117 206 117 195 116 185 115 174 114 165 114 154 113 144 112\n447 0.999 108 144 106 162 120 164 136 165 152 166 167 168 183 169 199 169 215 171 231 172 247 173 247 152 234 150 218 149 203 148 187 147 170 146 154 145 138 144 123 142\n447 0.999 127 169 126 186 132 189 146 191 158 191 171 192 183 192 197 193 211 194 223 195 236 196 237 178 229 174 217 174 203 173 191 172 177 171 164 171 152 170 139 169\n447 0.999 241 199 240 201 239 205 239 208 239 212 239 216 239 219 241 221 245 221 249 222 254 222 255 219 255 216 255 210 255 207 256 203 255 200 251 199 248 199 244 199\n447 0.999 406 256 406 258 406 259 406 261 409 262 410 262 412 262 414 262 415 262 417 263 420 259 420 257 420 255 417 254 416 254 414 254 412 254 410 254 408 254 406 254\n448 0.999 400 143 375 123 346 110 316 98 281 90 248 89 218 91 186 95 155 99 123 110 90 130 127 147 155 134 188 122 220 116 251 117 284 123 313 128 344 138 373 152\n448 0.999 174 157 176 177 193 178 208 178 221 178 236 179 253 179 269 180 285 181 298 182 314 179 313 162 300 161 283 160 268 160 251 159 236 159 218 159 201 158 185 158\n448 0.999 74 232 109 234 151 235 188 236 218 237 259 239 301 240 345 241 382 241 419 239 414 192 389 191 356 191 317 189 281 188 236 187 197 186 149 185 103 184 79 195\n449 0.999 109 33 111 60 113 90 131 91 159 82 179 75 201 72 226 73 247 80 269 92 289 98 292 76 293 43 275 38 251 35 229 33 203 32 177 32 153 32 128 32\n449 0.999 172 86 173 96 175 103 185 104 194 104 201 104 
210 104 218 105 227 105 234 106 243 106 243 90 240 86 232 86 223 85 215 84 206 84 197 84 189 84 180 84\n449 0.999 132 134 132 151 146 152 160 152 175 153 189 154 203 154 218 154 233 155 246 155 260 155 259 134 247 133 232 133 218 133 204 132 189 132 174 132 159 132 145 131\n449 0.999 141 155 143 171 154 172 168 173 182 173 195 174 207 174 222 174 235 174 248 175 261 175 260 156 248 155 234 154 221 154 208 154 194 154 180 154 167 154 154 154\n449 0.999 140 175 140 191 149 193 163 193 176 193 189 194 201 194 215 195 228 195 240 195 253 196 253 177 243 175 230 174 217 174 205 174 191 174 178 173 164 174 152 174\n450 0.999 184 45 193 68 215 69 242 64 269 66 295 72 313 83 336 100 352 120 365 143 391 137 388 118 376 95 359 74 339 59 315 47 291 38 262 32 235 31 207 37\n450 0.999 172 52 161 59 151 67 140 77 130 88 122 98 115 108 110 120 105 132 114 139 125 147 135 144 141 132 148 120 156 109 164 100 173 92 185 83 190 73 182 62\n450 0.999 368 167 367 180 368 193 369 206 373 217 384 217 395 216 408 216 421 216 436 216 444 214 443 200 444 188 443 174 441 166 431 166 419 165 404 165 392 165 379 166\n450 0.999 55 171 53 202 52 230 91 230 127 230 160 230 191 230 226 230 260 230 295 231 325 230 326 198 323 173 288 173 254 173 223 173 191 173 153 173 121 173 86 172\n450 0.999 443 216 435 216 426 216 417 217 408 217 400 217 393 217 384 218 381 224 381 231 382 238 390 239 398 238 406 238 414 238 423 238 431 239 441 239 445 233 443 224\n450 0.999 305 264 296 256 283 256 270 256 258 256 246 257 235 257 223 257 211 257 202 258 200 267 208 276 220 276 231 276 243 276 255 276 267 276 280 276 292 276 303 274\n450 0.999 133 294 153 308 177 324 203 334 228 340 258 341 285 338 310 331 335 319 356 302 368 285 348 266 326 278 302 292 275 300 247 303 221 299 193 287 170 272 149 272\n450 0.999 426 381 430 385 440 385 448 385 458 385 466 385 475 385 483 385 493 385 498 385 499 376 498 372 490 372 481 372 471 372 463 372 455 372 445 372 437 372 427 372\n451 0.999 397 100 360 89 321 81 280 79 237 
79 195 80 155 86 118 96 78 107 50 123 18 142 62 147 99 147 140 140 179 131 215 125 255 128 295 127 336 128 376 137\n451 0.999 453 141 412 140 366 142 320 143 280 144 239 144 192 144 143 145 97 148 52 149 42 192 80 203 127 200 171 199 220 198 261 197 305 196 349 195 394 195 439 193\n451 0.999 468 232 440 201 393 202 346 201 302 202 256 203 208 204 158 205 109 206 61 206 35 234 56 265 106 261 153 260 201 257 244 257 293 253 340 252 388 251 433 251\n452 0.999 130 111 141 130 155 135 178 123 203 113 229 107 253 106 278 107 304 115 325 122 334 100 336 76 314 67 290 65 266 63 238 61 211 62 188 68 166 80 148 93\n452 0.999 331 146 305 148 278 150 250 154 223 156 196 158 175 160 146 162 141 182 141 205 145 226 174 224 200 224 226 224 251 224 277 225 303 225 327 223 328 195 329 173\n452 0.999 140 266 140 283 158 286 179 286 200 287 220 288 241 290 260 292 281 293 300 295 315 288 316 270 296 269 277 268 256 268 236 267 217 266 195 265 177 264 158 265\n452 0.999 96 288 95 292 95 296 96 299 96 303 100 305 103 305 108 305 111 305 115 304 119 302 120 298 119 294 119 291 118 287 114 287 110 287 105 286 102 287 99 287\n452 0.999 289 298 289 309 289 320 300 321 312 322 323 323 334 323 345 324 357 325 368 326 379 323 378 311 377 301 367 300 355 300 344 300 332 299 320 298 309 297 299 297\n453 0.999 51 11 49 11 38 11 30 11 21 11 12 12 2 13 0 18 0 28 0 38 0 47 0 49 7 49 19 48 30 48 40 48 47 48 50 43 51 35 52 21\n453 0.999 154 81 148 76 139 72 130 70 121 70 112 70 103 72 96 74 88 78 82 81 85 90 91 96 97 93 105 90 113 88 121 88 130 89 138 91 145 95 150 90\n453 0.999 165 230 159 215 150 218 142 223 133 227 122 229 112 229 103 228 94 224 84 220 78 228 79 236 88 240 98 244 109 246 120 246 129 245 138 242 147 239 156 235\n454 0.999 51 72 52 90 58 101 76 95 96 89 114 87 132 86 151 88 170 92 187 99 202 100 204 79 196 68 177 61 158 56 140 54 122 54 103 56 84 59 67 64\n454 0.999 52 113 60 121 77 115 94 110 111 107 127 106 144 108 161 110 178 115 193 121 203 115 194 104 179 97 162 92 145 89 128 
89 112 90 95 92 78 96 62 102\n454 0.999 129 142 124 143 117 145 113 148 113 154 113 160 114 165 114 170 115 176 118 180 121 181 128 181 134 181 135 177 135 172 135 166 134 161 134 155 133 149 133 145\n454 0.999 117 190 117 194 117 197 117 201 119 202 122 202 125 202 128 202 131 202 134 202 138 200 139 199 139 194 138 190 135 190 132 190 130 190 126 190 123 190 120 190\n454 0.999 64 218 64 232 78 232 92 232 106 232 120 232 133 232 146 232 160 232 174 232 188 230 186 218 173 218 159 218 145 218 131 218 118 218 104 218 91 218 77 218\n454 0.999 82 234 82 244 91 246 101 245 111 245 120 245 130 246 140 246 149 246 159 246 169 243 167 234 158 234 148 233 138 233 129 233 120 233 110 233 100 233 91 233\n455 0.999 362 95 337 74 312 56 278 45 244 42 213 42 182 46 152 52 118 58 81 75 96 101 114 122 142 105 173 92 204 86 238 86 269 93 293 112 319 137 341 117\n455 0.999 84 256 84 285 120 288 149 289 180 290 210 291 242 292 276 293 308 294 334 294 362 293 361 262 330 261 300 261 269 261 237 261 205 259 168 257 138 255 114 254\n455 0.999 177 297 175 307 178 319 189 319 201 319 212 319 223 319 236 319 250 319 257 319 267 318 268 307 265 298 254 297 242 297 231 297 219 296 203 296 189 295 186 295\n456 0.999 379 63 352 58 329 57 304 57 281 57 255 58 230 59 206 60 180 62 155 63 144 80 163 84 190 81 215 78 240 76 265 76 290 76 313 77 339 78 364 79\n456 0.999 108 128 129 152 161 141 190 131 232 123 259 122 291 122 329 125 365 134 390 143 405 128 396 104 365 92 332 85 293 83 261 82 229 82 197 87 165 96 136 110\n456 0.999 340 146 320 147 304 147 285 148 272 149 250 150 232 151 217 150 198 150 184 159 184 177 200 179 221 179 239 178 257 178 276 178 294 176 313 176 332 176 340 164\n456 0.999 85 197 118 223 150 240 188 251 228 254 269 256 307 256 344 249 381 235 416 218 441 196 408 174 371 188 335 200 297 206 258 208 218 205 182 198 147 184 111 169\n457 0.999 4 38 16 50 30 59 46 47 64 39 85 34 103 35 123 42 139 52 150 59 163 44 164 29 149 17 130 8 111 4 91 2 70 3 51 7 35 14 20 26\n457 0.999 
169 60 158 60 146 60 135 60 124 60 115 62 114 78 123 78 133 79 144 79 155 79 166 78 164 71 153 71 143 71 132 71 134 71 144 72 155 72 167 70\n457 0.999 124 79 123 84 123 89 125 93 130 93 134 93 139 93 144 93 148 93 152 93 157 93 157 87 157 83 156 79 151 79 146 79 142 79 136 79 132 79 127 79\n457 0.999 76 81 70 81 64 82 59 83 54 83 49 83 44 84 38 84 35 87 35 91 35 96 40 96 45 95 51 95 56 94 62 94 66 93 72 93 76 91 76 86\n457 0.999 82 92 76 92 69 93 63 94 57 95 51 95 45 96 38 96 32 96 31 101 32 107 37 108 44 107 50 107 56 106 62 106 68 105 73 104 80 104 82 99\n457 0.999 121 96 121 101 123 104 127 104 132 104 136 104 141 104 146 104 150 104 155 104 159 103 160 98 158 96 153 96 148 96 144 96 139 96 134 96 129 96 125 96\n457 0.999 112 109 117 112 122 112 129 112 135 112 141 112 147 112 153 112 160 112 166 112 168 105 163 104 158 104 151 104 145 104 139 104 133 104 126 104 120 103 114 103\n457 0.999 115 112 115 117 120 120 126 120 131 120 137 120 142 119 148 119 153 119 159 119 164 118 164 112 160 111 154 111 148 111 143 111 137 111 131 111 126 111 120 111\n457 0.999 19 132 18 151 19 171 31 178 49 179 69 179 86 181 107 183 127 184 144 188 160 188 160 168 161 147 148 137 130 137 113 136 92 135 72 134 54 134 38 133\n457 0.999 35 184 35 196 44 200 58 199 70 199 82 199 94 199 107 199 119 199 130 199 142 198 142 187 132 185 120 184 108 184 95 184 83 184 70 183 58 183 46 183\n457 0.999 40 242 48 247 63 247 79 247 93 246 107 247 120 245 115 241 103 240 100 238 113 238 130 233 131 211 120 203 103 203 88 202 70 201 50 203 45 218 46 235\n458 0.999 216 45 199 56 185 70 172 86 161 102 150 120 142 140 133 159 128 179 123 198 139 208 155 205 159 185 165 167 172 148 181 128 194 111 208 94 223 77 227 62\n458 0.999 283 54 273 69 281 84 292 93 304 107 314 123 324 139 333 156 339 176 347 192 368 191 375 182 369 166 364 151 357 133 348 117 338 101 327 84 313 70 298 58\n458 0.999 295 110 275 111 257 112 229 113 196 117 193 148 191 166 188 192 185 213 193 225 225 226 236 227 262 225 292 224 
310 220 323 218 319 199 307 182 300 155 297 131\n458 0.999 294 261 290 249 285 245 273 246 264 247 253 248 242 248 230 248 219 247 212 248 210 256 210 269 217 272 227 272 238 272 250 272 261 272 272 271 285 270 292 268\n458 0.999 366 290 340 288 315 288 287 289 262 289 234 289 207 289 179 289 151 289 131 295 128 320 155 323 180 323 207 325 234 325 262 325 289 325 316 325 345 324 363 315\n458 0.999 35 304 35 311 35 318 35 327 35 332 42 337 52 337 61 336 73 336 83 335 89 335 90 327 89 320 90 313 88 304 83 302 73 303 63 302 53 302 43 302\n458 0.999 408 305 407 314 407 320 407 329 409 336 416 337 426 338 435 338 446 338 455 338 462 338 463 329 463 323 463 315 460 305 454 305 444 305 434 305 425 305 415 304\n459 0.999 184 123 175 123 167 124 157 124 148 124 140 125 132 125 123 126 114 126 106 127 106 137 115 138 123 138 132 138 141 137 151 137 160 137 168 136 177 136 184 133\n459 0.999 194 131 200 133 208 135 216 136 224 137 233 140 241 141 250 143 259 144 266 144 267 136 262 135 254 133 247 132 238 131 230 129 222 127 214 126 206 124 197 124\n459 0.999 95 130 91 129 85 131 80 133 75 135 70 137 66 139 61 141 56 142 52 145 52 150 57 152 62 151 68 148 73 146 78 144 83 142 88 140 94 138 97 135\n459 0.999 190 135 182 135 172 136 162 136 152 137 143 137 134 137 125 138 115 138 105 140 105 149 114 151 123 150 133 150 143 149 152 149 162 149 172 148 182 148 190 146\n459 0.999 225 141 222 140 218 140 214 140 210 140 207 140 204 141 201 141 197 141 193 141 193 144 195 147 199 147 203 147 206 147 210 147 214 147 217 147 221 147 224 146\n459 0.999 21 182 19 180 16 181 13 181 11 181 9 181 6 181 4 182 2 182 0 184 0 186 2 189 4 189 7 189 10 189 12 189 15 189 18 189 20 188 21 184\n460 0.999 146 6 129 5 111 5 93 8 76 13 61 19 47 28 35 41 25 54 13 69 15 82 34 82 44 69 54 56 67 44 82 35 98 30 115 27 131 28 146 23\n460 0.999 167 12 162 23 159 34 167 40 177 46 185 53 192 61 199 69 203 77 208 85 216 87 223 79 225 71 220 62 214 53 209 43 202 35 194 28 185 22 175 16\n460 0.999 47 94 55 112 73 
112 92 111 115 111 135 111 157 111 180 111 201 110 207 103 202 77 195 65 177 62 157 62 137 62 116 61 97 61 77 62 51 62 49 79\n460 0.999 38 118 40 141 43 160 60 165 82 166 103 166 124 165 146 165 167 165 184 165 199 158 200 140 199 119 181 119 160 119 141 119 121 119 99 119 79 119 59 118\n460 0.999 212 189 198 171 183 184 165 197 142 206 121 209 100 206 80 199 61 186 45 174 29 186 38 201 54 214 74 225 96 232 118 234 140 232 161 226 180 216 197 202\n461 0.999 241 75 255 96 273 85 291 70 313 60 338 57 359 60 382 67 405 79 420 92 435 87 425 68 408 53 390 39 367 33 345 32 320 33 298 39 276 46 258 61\n461 0.999 243 106 246 129 264 132 286 127 310 126 336 125 360 124 381 126 403 129 421 133 440 136 441 110 423 104 400 99 376 96 355 96 331 95 307 96 286 99 263 103\n461 0.999 305 146 311 151 320 152 328 154 336 154 344 155 353 154 362 153 371 152 380 150 381 139 375 136 366 138 357 139 349 140 340 140 331 140 323 138 315 136 308 137\n461 0.999 263 149 276 160 292 169 309 174 326 177 345 179 361 178 380 175 396 168 413 159 427 151 411 141 395 149 378 157 361 162 343 163 326 162 309 158 293 151 277 142\n461 0.999 454 187 438 177 412 180 384 182 360 183 337 182 313 181 286 180 261 178 239 183 236 211 256 216 281 216 308 217 333 216 358 216 384 217 411 216 436 216 454 210\n462 0.999 18 76 32 113 76 107 118 105 161 104 207 107 250 111 293 117 334 129 378 148 415 166 407 125 366 99 325 83 284 75 240 67 196 66 147 65 103 66 59 69\n463 0.999 211 155 226 178 248 202 273 192 302 183 329 181 357 184 386 192 412 206 433 208 442 182 448 149 425 133 400 121 375 113 347 111 314 111 283 117 258 126 236 138\n463 0.999 244 304 273 310 311 310 346 312 383 314 417 315 453 317 489 319 524 320 560 316 560 274 540 261 501 260 466 259 432 258 396 257 357 256 318 255 282 255 251 262\n464 0.999 406 121 381 108 346 105 314 107 282 110 251 114 217 122 186 130 158 138 128 148 123 175 146 181 174 172 206 164 238 158 270 152 301 151 332 151 362 154 393 157\n464 0.999 301 157 290 158 278 159 266 161 253 
162 242 164 228 166 216 168 204 169 192 171 192 185 203 186 215 184 227 182 239 181 252 179 263 178 275 176 287 176 298 174\n464 0.999 407 175 376 178 344 182 312 186 278 190 246 193 212 197 180 201 147 205 115 209 102 234 134 233 166 230 200 226 233 223 266 220 299 217 331 213 362 210 395 208\n464 0.999 442 235 412 226 373 229 336 232 298 236 262 239 224 241 185 244 149 247 112 250 80 260 111 266 148 263 185 260 223 258 261 255 298 253 335 250 371 247 409 244\n465 0.999 248 52 231 48 210 46 188 47 171 48 158 48 136 49 120 53 106 58 94 70 101 86 114 92 133 87 146 84 169 83 183 83 206 87 225 91 238 90 244 70\n465 0.999 95 152 94 171 93 192 103 204 122 204 157 203 173 202 190 202 213 202 226 202 245 201 247 182 248 164 237 152 219 151 194 151 174 151 153 151 131 152 113 152\n466 0.999 7 50 21 65 46 63 65 51 83 45 106 43 126 45 147 53 168 68 183 75 201 58 192 42 175 26 155 15 133 8 110 5 86 6 65 13 43 23 25 35\n466 0.999 48 60 50 75 62 84 77 83 91 83 105 83 111 83 124 83 139 83 151 82 162 79 162 66 151 57 139 53 127 48 114 46 98 46 84 48 70 50 59 54\n466 0.999 204 85 178 85 166 85 145 85 120 85 94 86 59 85 35 86 13 87 13 109 14 134 38 135 64 134 88 134 110 133 132 133 160 133 181 133 201 133 204 109\n466 0.999 17 141 21 158 48 158 69 158 89 158 109 158 124 157 143 157 167 157 184 157 200 152 194 135 174 135 153 135 134 135 115 135 93 135 73 135 51 136 32 136\n466 0.999 206 158 182 158 170 158 150 158 124 158 100 158 66 159 48 160 25 160 12 170 12 195 36 197 60 197 83 197 104 197 127 197 153 196 173 196 192 196 206 183\n466 0.999 205 198 172 198 146 198 123 198 89 198 56 199 11 206 12 241 39 265 77 264 107 263 130 263 156 263 188 262 204 235 176 232 143 233 135 230 164 230 205 228\n466 0.999 205 266 180 265 168 265 147 266 121 266 97 266 60 267 39 267 19 268 10 283 10 308 34 310 59 310 83 309 105 309 127 308 156 308 175 308 195 307 205 289\n467 0.999 69 109 72 150 76 190 94 218 144 208 190 181 227 174 252 177 292 186 333 211 375 210 376 179 375 141 358 110 321 108 280 
110 234 110 191 109 148 106 111 105\n467 0.999 295 195 283 185 268 182 254 182 243 182 232 182 213 182 197 182 186 182 179 192 179 207 192 209 206 209 221 209 236 209 250 210 264 210 281 210 293 209 296 196\n467 0.999 444 197 444 201 444 209 444 219 452 224 465 223 468 222 473 220 484 220 489 220 499 218 499 211 499 202 498 192 493 191 484 191 477 191 468 191 458 191 452 191\n467 0.999 438 244 440 246 448 247 457 246 466 246 478 246 484 247 493 246 498 241 498 242 499 238 499 233 498 227 489 227 480 227 472 227 463 227 455 228 447 228 439 230\n467 0.999 140 237 143 257 152 268 176 268 200 269 227 269 242 269 263 268 286 268 305 268 326 267 328 244 317 234 295 234 271 234 248 234 227 234 203 234 181 234 162 233\n468 0.999 246 22 242 22 238 23 234 23 230 24 226 25 222 25 218 27 216 29 216 35 216 39 220 42 225 40 229 39 232 39 236 38 239 38 243 37 246 34 246 28\n468 0.999 304 58 290 58 275 60 260 62 245 65 230 68 216 71 202 75 186 79 185 92 188 110 198 113 213 108 230 106 244 103 259 101 272 99 286 99 299 99 301 82\n468 0.999 346 111 325 106 300 101 276 101 252 104 227 108 204 115 182 124 161 134 139 147 143 172 161 173 183 162 207 152 231 146 254 142 277 141 300 143 323 151 338 145\n468 0.999 373 161 350 166 319 181 292 196 263 208 233 216 204 217 173 215 146 205 121 203 114 236 140 246 168 254 201 257 233 254 264 248 293 240 321 234 348 223 374 206\n469 0.999 258 124 257 103 242 81 221 67 199 58 171 52 141 53 116 58 97 66 79 78 61 95 61 127 71 139 102 137 127 136 151 135 174 134 197 134 217 133 239 133\n469 0.999 252 216 245 208 220 208 198 209 176 210 151 210 126 210 101 209 83 212 82 235 88 267 101 277 118 285 142 290 170 289 193 285 213 279 229 272 247 262 249 243\n470 0.999 125 145 151 140 165 111 185 89 207 75 240 67 277 73 296 84 320 107 335 135 359 129 346 101 326 77 299 56 268 43 233 41 200 49 171 66 149 89 134 117\n470 0.999 207 87 206 89 207 93 209 99 209 100 211 100 216 100 218 100 222 100 225 100 228 99 230 96 230 91 229 87 226 86 223 86 220 86 215 86 
210 86 210 86\n470 0.999 301 199 292 199 277 199 264 200 248 201 233 202 219 202 205 202 191 204 191 214 191 226 205 227 218 227 233 227 245 226 259 226 271 226 284 226 297 225 301 214\n470 0.999 396 229 372 224 340 225 308 226 272 227 239 228 206 229 172 230 138 233 110 233 96 253 123 260 155 258 188 257 221 256 254 254 284 253 316 251 345 250 378 249\n471 0.999 378 72 361 54 340 43 317 37 293 35 269 36 245 39 222 44 200 54 181 68 162 84 184 96 204 83 224 71 247 64 270 61 293 62 316 65 336 72 358 83\n471 0.999 379 96 360 97 333 99 306 100 280 101 253 102 226 102 199 101 172 100 157 113 155 150 167 158 198 157 224 156 253 154 280 153 307 152 333 151 358 150 376 143\n471 0.999 351 154 335 154 316 155 297 157 277 158 258 158 238 160 218 161 199 162 180 164 178 186 195 189 215 188 235 188 254 187 273 186 292 185 312 184 330 183 349 182\n471 0.999 384 196 361 196 335 197 310 198 285 199 259 199 233 200 207 202 182 203 156 204 141 221 165 222 191 221 217 220 242 219 268 219 294 217 318 216 343 215 369 214\n471 0.999 379 225 357 225 331 226 306 227 281 228 255 229 230 230 205 231 179 231 153 232 146 259 166 263 193 262 220 261 246 260 271 260 297 260 322 258 346 256 372 255\n472 0.999 448 131 422 99 381 95 324 94 271 96 227 96 188 98 150 104 116 123 82 140 73 182 111 215 151 204 191 185 237 176 275 172 303 171 347 172 408 180 439 169\n472 0.999 102 292 141 292 190 287 216 286 256 287 294 293 346 300 386 293 419 281 438 245 422 200 405 200 361 199 328 198 288 184 242 176 185 189 147 208 126 213 95 234\n473 0.999 113 59 113 79 113 96 120 104 140 104 158 104 176 104 194 104 217 104 230 105 246 105 244 86 243 69 234 60 215 59 199 59 182 58 163 58 145 58 129 59\n473 0.999 173 119 174 139 177 157 195 149 212 147 229 153 247 162 266 169 291 170 304 166 320 161 321 144 319 129 300 133 280 127 263 120 246 115 226 112 205 111 188 114\n473 0.999 365 215 346 214 327 214 309 214 288 215 270 215 252 215 235 215 219 215 207 224 207 244 224 245 242 246 262 246 280 246 299 246 318 246 
335 246 355 246 362 233\n474 0.999 382 80 351 68 317 64 286 67 259 72 233 80 203 93 178 107 156 121 128 148 125 178 150 165 176 147 203 133 237 120 271 110 300 107 321 107 347 112 379 118\n474 0.999 148 209 149 217 162 225 183 231 207 230 228 227 244 225 255 216 260 198 260 201 262 189 261 182 261 168 240 171 221 174 202 169 186 164 162 170 156 188 152 199\n474 0.999 353 203 347 206 342 210 337 212 334 215 332 216 327 219 325 224 325 222 326 231 328 237 331 238 337 234 341 231 347 227 352 224 355 221 354 222 356 220 356 214\n474 0.999 319 209 299 217 277 227 255 236 238 240 226 244 203 247 183 247 166 246 153 256 152 287 170 289 195 288 218 285 240 281 263 274 283 266 293 261 307 255 318 244\n475 0.999 169 34 164 33 159 33 154 32 147 32 142 33 136 33 130 33 126 34 123 38 124 44 129 45 134 45 139 44 144 44 150 44 155 44 160 44 166 44 169 40\n475 0.999 108 32 98 33 86 33 77 34 64 36 55 38 42 41 31 45 30 54 31 62 32 74 42 71 51 67 62 64 73 62 83 61 94 61 105 62 108 53 108 42\n475 0.999 272 33 261 34 247 34 237 35 219 35 207 35 191 36 180 37 182 50 185 62 188 77 197 78 209 75 223 73 236 73 251 75 263 77 273 75 272 59 272 46\n475 0.999 107 72 95 72 83 72 73 73 56 73 46 74 30 75 20 75 17 83 18 93 19 105 31 105 43 104 56 104 68 104 81 104 92 104 104 104 108 95 108 83\n475 0.999 191 83 191 93 200 95 211 94 220 94 229 94 239 94 248 94 258 95 266 95 273 89 273 80 263 79 253 79 244 79 235 79 226 79 216 79 206 79 197 79\n475 0.999 188 80 178 82 169 84 160 86 149 86 141 86 130 85 121 83 112 80 109 89 109 100 118 101 127 101 137 101 146 102 156 102 166 101 176 100 186 100 188 92\n475 0.999 154 154 161 168 168 178 183 173 195 169 210 167 224 168 240 171 253 179 264 180 274 169 277 158 265 150 252 143 238 138 224 136 209 136 193 138 178 142 167 148\n475 0.999 137 140 121 140 104 140 89 140 66 140 49 141 30 140 23 148 24 164 24 180 25 195 40 193 57 192 75 192 92 194 109 196 126 198 139 192 138 174 138 156\n475 0.999 253 180 243 177 233 175 224 173 213 174 206 170 194 170 184 172 
171 176 170 180 170 190 171 198 184 199 195 198 205 198 216 199 226 199 237 199 246 199 251 190\n475 0.999 52 191 55 201 57 210 68 210 79 212 89 215 97 210 100 215 112 217 122 216 133 215 136 207 130 201 121 201 110 201 102 200 93 195 82 193 72 191 61 190\n475 0.999 185 206 177 207 171 210 166 210 159 213 161 222 160 225 165 229 168 233 175 232 181 230 188 226 191 221 187 217 181 220 175 223 174 224 179 220 185 219 187 212\n475 0.999 30 220 30 234 29 246 37 253 51 253 66 252 79 252 92 252 106 252 118 252 132 252 132 241 132 226 125 220 110 220 96 219 82 219 68 218 55 218 42 219\n475 0.999 47 257 48 267 48 277 55 281 64 280 73 280 82 280 91 280 101 280 109 280 117 280 118 270 118 261 111 258 102 258 92 258 84 258 74 257 65 257 56 256\n475 0.999 167 272 169 284 175 290 186 290 197 290 209 290 220 290 231 290 242 290 252 290 263 290 263 278 256 272 245 272 234 272 223 273 212 272 200 272 189 272 178 272\n476 0.999 63 203 103 225 131 195 166 166 209 157 256 157 303 161 349 178 387 199 426 223 457 208 433 175 395 145 353 124 307 113 259 107 211 110 165 120 124 135 90 168\n476 0.999 200 173 199 188 201 204 216 204 233 204 250 204 265 204 281 205 296 205 312 205 328 205 329 191 325 178 310 176 293 175 278 173 262 172 245 171 231 171 214 171\n476 0.999 103 237 103 250 106 263 120 262 134 263 147 263 161 263 174 263 188 263 201 264 215 264 215 250 211 239 197 238 183 237 170 237 156 237 141 236 129 235 115 236\n476 0.999 402 236 393 236 384 236 373 236 364 236 355 236 346 236 337 236 335 244 336 253 337 262 345 262 355 262 364 262 373 262 382 263 391 263 401 262 402 253 402 245\n476 0.999 103 271 103 284 106 296 120 296 134 295 147 295 160 295 174 295 186 295 200 295 213 295 213 281 210 271 195 271 182 271 170 270 156 270 141 270 129 270 115 270\n476 0.999 424 268 410 269 396 268 381 269 368 269 355 269 341 269 327 269 314 270 313 281 314 295 326 296 341 295 355 296 367 296 382 296 394 297 410 297 422 294 423 283\n476 0.999 59 342 97 348 138 348 179 348 224 348 268 348 313 
348 356 348 398 348 441 347 465 318 429 312 390 312 344 312 301 312 257 312 214 312 167 311 126 311 81 311\n477 0.999 183 39 183 45 182 54 185 62 192 63 198 65 206 67 213 69 220 70 225 74 233 77 235 68 235 58 232 54 227 50 221 47 215 45 207 43 199 41 191 39\n477 0.999 185 65 184 72 184 81 188 88 193 90 200 91 207 93 214 95 221 97 227 98 234 100 236 94 236 82 232 79 228 75 222 72 215 70 208 69 200 67 191 65\n477 0.999 100 77 98 116 106 151 105 202 141 219 159 230 215 224 242 215 275 215 308 224 344 245 364 228 356 187 346 152 314 147 278 137 248 125 225 107 177 90 132 81\n478 0.999 225 76 226 82 234 85 241 82 248 79 256 79 263 79 271 80 277 83 283 85 291 84 290 76 284 72 277 70 269 68 262 67 254 67 245 68 238 71 231 73\n478 0.999 188 89 186 108 186 125 191 136 215 129 234 124 253 123 272 124 290 127 308 134 326 138 327 121 327 103 322 93 301 91 280 90 260 89 238 89 220 88 203 88\n478 0.999 393 108 392 121 404 124 417 124 429 125 440 125 451 126 463 127 475 127 486 128 497 127 496 116 485 114 474 113 462 113 451 113 439 112 426 111 415 110 403 108\n478 0.999 394 138 402 141 413 141 423 142 435 142 447 142 459 142 471 142 482 144 494 145 497 133 489 131 476 130 466 130 455 129 443 128 430 128 419 127 408 126 397 126\n478 0.999 194 147 201 152 215 152 227 152 242 152 257 153 272 154 284 154 295 154 308 154 316 144 308 140 295 140 282 139 269 139 256 138 241 138 228 138 216 137 204 137\n478 0.999 422 151 423 154 431 154 438 154 444 154 450 155 459 155 466 155 472 155 475 152 475 142 471 142 465 142 459 142 453 142 447 142 439 142 433 141 426 141 422 143\n478 0.999 184 154 185 170 197 174 213 174 231 174 247 174 263 174 280 174 295 174 311 174 326 173 326 155 311 154 296 154 280 154 263 154 247 154 230 154 214 154 199 154\n478 0.999 243 175 242 181 242 185 242 190 243 195 247 196 253 196 257 196 262 196 266 196 271 195 271 189 272 185 272 180 271 176 266 175 260 175 256 175 251 175 246 175\n478 0.999 316 197 302 196 287 196 274 196 259 196 245 196 231 197 217 198 204 198 
195 202 195 217 209 217 222 217 236 217 250 217 264 217 278 217 293 217 307 216 316 210\n479 0.999 61 162 82 161 104 151 132 142 156 137 183 133 209 133 234 134 260 137 289 139 282 128 255 119 231 113 204 111 179 110 154 110 129 112 103 116 78 123 59 139\n479 0.999 168 150 180 152 192 152 204 153 218 154 231 154 243 155 256 155 268 156 281 157 286 149 277 145 265 145 252 144 239 144 227 143 213 142 201 142 189 141 174 141\n479 0.999 140 151 151 161 170 162 187 162 204 164 223 164 241 165 260 165 277 166 295 167 311 166 299 158 281 157 264 156 246 155 228 155 210 154 192 153 174 152 156 151\n479 0.999 164 164 171 174 185 174 198 174 211 175 225 175 239 175 252 176 266 176 278 176 291 175 283 168 269 167 256 166 243 166 229 165 216 165 203 164 189 164 176 163\n479 0.999 19 171 20 174 21 176 24 176 26 176 29 176 32 176 35 177 37 177 40 176 41 174 42 170 39 169 36 169 33 169 31 169 29 169 26 169 24 169 21 169\n479 0.999 139 221 138 210 128 203 121 204 109 204 99 206 88 206 78 206 68 206 58 206 46 206 47 223 56 229 65 229 77 229 91 227 103 226 114 226 125 226 131 226\n479 0.999 132 227 123 226 113 227 105 227 95 228 85 228 77 229 67 229 58 229 49 230 45 237 52 240 62 240 71 239 80 239 91 239 101 238 111 238 120 237 129 237\n479 0.999 346 247 338 247 331 247 325 248 317 249 310 248 303 249 296 249 289 249 281 249 280 257 286 258 293 258 300 258 307 258 317 257 324 257 331 257 339 257 346 256\n479 0.999 347 257 340 257 332 257 325 257 318 257 310 258 304 258 296 258 289 258 281 258 279 266 285 268 292 268 299 267 307 267 316 267 323 267 331 266 338 266 345 266\n480 0.999 48 190 90 211 108 172 141 146 186 136 223 140 267 152 309 171 344 199 377 231 415 213 393 175 359 140 320 113 275 93 232 86 191 84 139 91 97 113 67 148\n480 0.999 137 226 146 262 162 267 182 270 203 272 223 274 246 276 270 277 293 277 313 276 332 273 331 238 317 236 295 237 273 238 252 237 231 236 206 234 183 231 161 229\n481 0.999 91 19 90 27 92 33 99 33 107 35 113 38 119 44 124 52 127 59 134 57 142 54 
144 49 141 43 138 35 133 29 127 24 120 21 113 19 106 18 98 18\n481 0.999 84 23 79 25 73 28 69 31 64 35 61 38 57 42 55 47 52 52 51 57 56 59 62 61 66 59 69 53 72 49 76 44 80 40 84 37 87 33 86 28\n481 0.999 139 83 128 78 119 78 112 87 102 94 90 96 79 94 70 87 62 77 51 80 47 86 54 96 61 105 71 111 82 115 94 115 106 112 116 107 125 100 132 92\n481 0.999 18 128 16 146 15 165 30 165 51 164 66 165 86 167 103 167 120 168 138 169 156 170 158 155 156 141 138 140 122 139 104 137 86 135 68 134 50 131 33 129\n481 0.999 50 166 49 175 51 180 58 181 66 182 73 182 80 183 88 184 95 184 103 185 108 184 109 175 107 169 100 169 92 169 85 168 78 167 70 167 63 167 55 166\n481 0.999 9 192 22 195 38 196 52 198 68 199 83 201 99 202 114 203 130 205 149 206 154 192 138 188 125 187 111 186 95 185 79 183 61 182 45 180 31 179 14 177\n482 0.999 160 57 154 56 148 58 142 59 136 60 130 61 124 62 117 63 111 64 105 65 104 72 109 73 116 71 122 70 128 69 134 68 140 67 145 66 151 66 158 65\n482 0.999 181 69 179 68 177 68 174 68 172 69 169 69 167 70 165 70 162 70 160 71 160 75 162 76 165 76 167 75 170 75 171 75 174 75 177 75 179 75 181 72\n482 0.999 140 69 138 68 135 68 133 69 131 69 129 70 126 70 124 70 122 71 122 74 122 78 124 78 127 78 129 78 131 77 133 77 135 77 137 76 139 76 140 73\n482 0.999 102 80 99 80 97 80 94 81 92 81 89 81 87 82 84 83 81 83 80 84 80 88 82 90 85 89 88 88 90 88 92 88 95 87 97 86 99 86 102 84\n482 0.999 221 107 205 101 184 95 163 92 142 92 120 95 100 100 80 108 63 117 43 129 47 149 59 153 79 143 99 135 118 129 137 126 158 127 178 130 196 136 211 134\n482 0.999 169 139 167 138 165 138 164 138 162 139 160 139 159 139 158 139 156 140 155 140 155 143 156 144 158 144 160 144 161 144 162 143 164 143 165 143 167 143 169 141\n482 0.999 95 146 93 146 91 146 90 147 88 147 87 147 85 147 83 148 81 148 81 150 81 152 82 153 84 153 86 153 88 153 89 153 91 153 92 152 94 151 95 148\n482 0.999 198 163 184 164 165 166 147 168 130 170 112 171 93 173 75 175 56 177 48 180 46 208 61 207 79 205 99 203 116 
202 133 200 150 199 168 197 185 196 197 191\n482 0.999 133 205 122 205 111 206 100 207 89 208 78 209 67 210 56 211 45 212 35 214 33 227 43 227 55 226 66 226 77 225 87 224 98 223 109 222 119 221 131 220\n482 0.999 210 221 196 220 182 221 168 221 154 222 139 223 126 224 111 225 97 226 82 227 79 241 91 243 106 241 120 241 134 240 148 239 163 238 176 237 190 236 205 235\n483 0.999 195 43 195 54 195 66 198 78 211 74 222 71 223 75 234 76 243 70 233 65 225 67 229 65 240 65 251 66 255 51 252 43 241 43 228 41 216 41 205 41\n483 0.999 208 73 208 78 207 84 213 86 219 86 225 86 230 86 237 86 245 86 248 87 253 87 255 80 252 76 246 75 242 75 236 75 230 74 225 74 219 72 214 72\n483 0.999 165 99 167 115 180 119 196 119 213 120 228 120 243 120 259 121 279 121 291 122 305 121 303 104 288 103 273 103 258 103 242 102 227 101 210 100 195 100 180 99\n483 0.999 101 145 125 159 150 159 175 159 208 159 236 159 265 159 296 159 336 159 355 154 354 135 332 124 306 124 278 123 251 122 221 122 192 121 163 120 134 120 102 122\n483 0.999 363 164 334 164 304 164 275 164 246 165 218 165 189 165 159 165 134 166 101 166 86 182 114 186 144 185 173 184 201 184 230 183 259 182 289 181 318 181 346 180\n483 0.999 332 229 308 230 282 233 260 234 237 236 214 238 189 240 167 241 153 242 128 252 130 284 152 284 178 281 202 278 223 276 247 273 270 271 295 269 322 266 332 251\n484 0.999 189 61 171 46 150 35 127 30 105 30 82 35 63 44 49 61 42 81 42 102 42 126 58 122 59 97 60 75 68 60 86 50 109 44 132 45 153 52 171 67\n484 0.999 173 74 173 82 173 90 174 98 174 105 174 111 174 119 174 125 175 133 183 133 191 133 192 129 191 122 191 115 191 103 191 95 190 88 190 80 188 75 178 75\n484 0.999 102 99 103 104 103 107 104 111 106 113 109 113 113 114 117 114 121 113 124 113 128 112 128 108 127 105 126 101 124 100 120 99 116 99 112 98 109 98 102 99\n484 0.999 43 154 57 163 70 173 84 179 99 183 115 185 131 184 149 180 163 172 177 164 185 152 173 145 160 154 146 161 132 166 116 168 98 165 82 160 68 150 57 141\n484 0.999 28 
228 47 231 62 231 83 231 102 231 121 231 140 231 160 232 180 232 203 231 204 216 186 211 169 211 150 211 131 210 111 210 92 210 72 209 50 208 29 208\n485 0.999 65 126 102 123 140 112 179 99 218 94 256 92 296 98 333 107 371 120 409 129 428 115 396 97 357 84 318 71 277 63 238 62 199 66 159 74 122 85 84 95\n485 0.999 148 125 153 144 176 144 199 144 222 144 244 144 266 144 288 144 310 144 332 144 353 141 348 121 323 121 300 121 278 121 256 121 234 121 211 120 189 120 166 120\n485 0.999 108 170 115 195 145 195 175 195 205 195 234 195 265 195 293 195 324 195 353 195 383 195 376 171 344 171 315 171 286 171 256 170 226 170 196 170 167 170 137 170\n485 0.999 68 219 72 255 104 255 140 255 174 256 207 256 241 256 275 256 309 256 342 257 372 253 372 225 330 220 297 219 264 218 231 218 197 217 162 216 129 216 95 214\n485 0.999 401 230 400 234 400 238 401 242 401 247 401 251 402 254 406 256 412 256 416 256 421 254 422 253 421 248 421 242 421 236 421 231 419 230 413 229 409 229 405 229\n485 0.999 61 260 64 304 103 304 147 304 187 304 227 304 268 304 309 304 349 304 389 304 429 303 427 263 382 265 343 265 303 265 262 265 221 266 180 266 139 263 99 261\n486 0.999 108 62 121 95 150 99 182 95 213 96 245 98 273 106 304 115 334 127 362 140 387 139 392 111 356 94 330 81 302 72 272 63 240 57 207 53 173 52 141 53\n486 0.999 106 105 106 147 108 189 147 193 184 199 222 203 258 209 296 215 334 220 369 225 413 229 411 193 405 154 365 148 330 142 297 134 262 127 227 120 183 108 143 104\n486 0.999 152 220 152 250 171 267 199 270 226 272 253 275 280 280 307 283 336 287 363 290 391 292 391 258 375 244 346 240 321 236 295 233 268 230 240 226 211 223 182 219\n487 0.999 27 81 47 86 63 72 82 63 102 58 123 56 145 58 165 64 183 73 200 86 218 79 202 64 184 53 163 46 143 42 122 41 100 43 80 47 60 53 43 65\n487 0.999 187 81 170 80 149 81 131 81 113 82 95 82 79 82 61 82 47 87 47 106 46 122 65 124 86 123 104 123 120 123 137 124 157 124 177 124 184 112 186 96\n487 0.999 226 117 216 113 207 110 199 118 193 
126 186 134 178 138 168 143 159 146 150 148 149 158 153 165 164 162 174 159 182 156 191 151 199 146 207 140 214 133 221 125\n487 0.999 20 123 30 135 40 146 52 154 65 160 80 165 94 167 109 169 123 169 137 169 149 160 139 151 123 151 109 151 95 150 80 146 66 140 54 133 44 122 33 116\n487 0.999 104 131 103 136 104 140 104 143 107 145 112 146 116 146 120 146 125 146 131 146 134 145 135 140 135 135 133 131 129 131 124 131 119 131 115 131 111 131 107 131\n487 0.999 62 223 72 223 84 219 95 215 107 212 118 212 131 213 142 216 154 220 165 224 174 219 163 213 152 209 140 205 128 203 116 202 103 203 92 206 80 210 69 214\n487 0.999 107 213 107 216 107 218 109 220 112 220 114 220 117 220 119 221 122 221 125 221 127 218 127 216 126 212 124 212 121 212 119 212 116 212 114 212 112 212 109 212\n487 0.999 92 229 94 231 99 231 105 232 110 232 116 232 122 232 126 232 132 232 138 231 139 222 137 221 131 221 125 221 120 221 114 221 109 220 102 220 95 220 93 221\n487 0.999 67 223 67 228 71 231 77 232 83 234 88 236 94 239 99 241 104 243 109 245 114 243 113 236 108 235 102 232 97 231 92 228 87 226 82 225 77 223 72 222\n487 0.999 164 223 159 221 155 223 150 225 146 227 141 229 136 231 131 233 126 234 123 237 125 243 128 244 134 242 138 240 143 238 147 236 153 235 158 233 163 231 165 227\n488 0.999 184 101 180 124 175 149 179 165 205 166 233 166 259 168 283 171 310 179 332 188 354 197 359 179 362 159 350 147 327 140 304 130 278 121 253 114 228 109 206 105\n488 0.999 195 174 194 196 192 216 210 221 233 223 257 226 281 229 304 232 326 234 346 237 369 239 368 225 365 208 348 199 325 195 304 191 281 187 258 183 237 180 214 176\n488 0.999 392 191 395 205 397 216 406 222 420 224 433 226 448 229 460 231 473 233 486 235 497 237 498 224 496 213 484 210 471 207 457 204 445 201 431 198 418 195 406 193\n488 0.999 117 197 111 201 108 206 103 214 99 220 94 226 90 232 87 237 83 243 83 249 84 256 91 251 94 246 98 241 102 236 108 230 113 225 117 219 117 212 117 205\n488 0.999 412 237 413 247 419 253 429 253 
438 254 448 255 459 257 468 257 477 258 486 258 495 256 495 247 487 246 478 244 469 243 460 242 450 240 440 239 430 239 421 238\n488 0.999 231 252 236 257 248 258 260 259 272 260 283 261 295 262 307 263 319 263 331 264 334 250 327 246 315 246 305 245 293 245 281 244 270 243 257 242 246 241 234 241\n488 0.999 96 260 93 262 90 265 88 267 85 271 84 274 83 277 82 280 82 283 82 287 82 290 87 290 89 287 93 284 95 281 98 279 99 274 98 271 98 266 97 263\n488 0.999 314 292 314 297 317 301 322 301 327 301 334 301 338 301 344 301 349 302 353 302 358 297 356 291 352 290 348 290 343 289 337 289 332 289 327 288 321 288 316 287\n488 0.999 150 297 150 303 150 308 156 309 162 309 168 309 174 309 180 308 185 308 191 308 197 308 197 302 195 297 189 297 184 296 179 296 172 296 166 296 161 296 154 296\n488 0.999 313 309 314 314 320 316 325 316 332 316 338 315 344 315 350 315 356 315 362 315 365 307 362 304 358 302 352 302 346 302 340 302 333 302 327 302 321 302 315 302\n488 0.999 289 305 279 304 271 306 262 308 253 309 243 311 234 313 224 315 216 316 213 322 215 333 220 336 228 333 238 331 246 329 256 326 265 324 274 321 282 318 289 314\n488 0.999 149 312 149 317 149 322 155 325 161 324 167 324 174 324 179 324 186 324 192 324 198 322 198 316 196 310 191 310 185 310 178 310 172 310 165 310 159 310 153 311\n488 0.999 304 324 290 326 275 331 259 335 249 338 263 334 278 330 293 326 289 319 275 322 262 325 246 329 234 336 219 339 221 353 237 350 254 347 269 343 284 341 299 338\n488 0.999 368 316 362 315 356 315 349 315 343 316 336 316 329 316 323 316 317 316 316 321 316 328 319 332 325 331 332 330 338 330 345 330 353 329 358 329 365 328 369 323\n488 0.999 149 328 149 333 149 339 154 342 159 341 165 341 170 341 176 340 182 339 187 339 192 339 193 333 192 327 187 326 182 326 176 327 170 327 164 327 158 328 153 327\n489 0.999 24 28 25 65 24 104 43 119 71 106 101 96 143 90 175 90 219 96 254 108 284 119 293 97 293 60 277 31 241 30 207 27 171 24 133 25 96 27 62 27\n489 0.999 81 107 85 121 102 121 
121 120 137 120 152 120 170 119 186 119 206 119 223 119 236 118 231 104 215 103 198 102 182 102 165 101 148 101 131 102 114 104 97 105\n489 0.999 64 126 64 150 65 175 92 181 110 181 132 180 160 180 182 179 215 179 241 178 255 177 258 155 258 133 235 127 212 128 187 128 163 129 137 129 111 129 88 126\n489 0.999 296 186 267 185 238 186 212 186 182 186 153 187 125 187 97 188 71 188 43 189 28 205 56 207 84 207 114 206 141 206 170 205 197 205 225 204 254 204 280 203\n489 0.999 79 225 73 224 66 224 59 224 52 224 46 225 40 225 33 226 26 226 20 226 15 229 21 230 28 230 35 230 42 230 48 230 55 230 62 230 68 230 75 230\n490 0.999 96 78 112 105 129 124 159 112 193 103 227 98 258 98 292 103 323 113 352 125 365 105 376 78 347 62 320 52 287 44 256 41 222 42 186 46 155 54 128 64\n490 0.999 137 177 139 205 137 229 135 261 162 266 192 268 221 268 251 269 281 271 307 271 332 270 331 243 332 214 333 189 306 184 278 187 250 188 218 187 189 182 166 178\n490 0.999 115 200 105 203 95 205 86 209 76 212 66 216 57 219 49 223 44 228 47 236 51 245 60 245 69 241 78 237 87 233 96 230 105 227 114 226 118 217 116 209\n490 0.999 359 203 356 213 355 223 364 229 375 232 386 236 396 240 406 244 417 248 427 252 437 254 440 243 442 233 433 229 422 224 412 220 402 216 390 211 379 206 369 204\n490 0.999 123 300 141 312 162 312 188 313 216 313 242 314 268 314 295 316 321 316 347 316 350 292 332 279 309 278 283 278 258 278 231 276 205 276 179 275 151 275 123 277\n491 0.999 145 132 154 170 175 203 219 201 254 200 295 205 336 201 374 204 415 219 429 220 451 183 443 159 411 140 375 130 340 125 304 123 278 115 246 126 225 121 186 121\n491 0.999 142 190 133 191 126 191 116 192 107 194 100 195 93 196 83 197 73 198 69 204 71 214 79 215 88 214 97 213 106 212 115 211 122 209 130 208 141 208 143 200\n491 0.999 278 203 268 203 258 199 241 198 228 199 218 199 211 199 198 199 179 201 175 206 176 219 184 222 196 222 209 221 222 222 234 221 245 220 255 221 272 221 278 213\n491 0.999 291 203 293 210 294 222 305 222 315 
221 327 222 341 222 353 221 363 222 368 222 377 220 379 211 376 205 367 202 357 201 346 200 337 199 323 199 314 198 301 198\n491 0.999 136 210 129 210 123 211 114 213 108 213 102 214 96 215 89 215 81 218 80 226 81 233 89 233 95 232 103 231 110 230 117 229 123 228 129 227 137 225 138 217\n491 0.999 399 228 373 228 348 228 323 228 297 228 275 228 250 228 223 228 199 228 173 228 161 246 188 247 213 247 238 247 262 247 288 247 313 247 335 248 365 248 393 247\n491 0.999 266 320 266 324 270 325 273 325 277 325 281 326 285 325 289 325 292 325 294 326 298 324 297 319 294 317 291 317 287 317 283 317 280 317 274 317 272 318 269 317\n492 0.999 452 172 416 161 373 162 331 163 287 163 232 170 189 177 148 181 104 180 81 204 78 263 120 258 156 250 202 244 242 239 285 244 336 247 379 248 420 248 446 224\n492 0.999 273 241 267 238 262 238 256 238 250 238 244 238 238 239 233 239 226 239 223 242 223 250 227 253 233 252 239 252 244 251 250 251 257 251 262 251 268 251 272 248\n493 0.999 26 156 34 225 95 197 100 151 140 116 199 95 270 90 341 104 396 147 403 189 447 230 472 189 466 124 422 72 357 40 298 25 228 24 161 36 101 65 57 104\n493 0.999 389 137 359 130 325 130 288 130 254 130 221 131 188 132 155 133 121 134 106 153 107 185 139 187 171 187 207 187 242 186 273 186 306 186 337 185 373 185 387 164\n493 0.999 128 199 129 227 137 250 164 251 192 250 217 251 251 250 281 250 312 250 340 250 364 247 368 219 360 198 333 197 302 197 275 197 244 196 215 196 184 196 155 196\n493 0.999 79 259 97 286 135 286 166 286 201 286 237 286 274 288 311 288 350 288 382 288 415 283 398 258 368 257 333 257 295 256 259 256 221 256 184 255 146 255 111 255\n493 0.999 229 294 229 298 228 304 227 312 228 314 232 316 238 316 243 316 250 316 254 315 260 314 260 309 260 304 260 297 259 291 255 290 249 290 243 290 237 290 233 291\n494 0.999 123 55 146 85 172 82 202 72 233 68 264 71 295 78 325 90 352 107 378 125 399 114 392 91 366 70 339 54 310 43 279 34 248 27 215 26 183 31 153 42\n494 0.999 118 106 119 145 151 151 
185 156 218 162 252 167 286 173 321 179 355 183 390 188 426 190 429 154 393 148 359 142 327 136 293 130 258 125 223 120 190 115 153 108\n494 0.999 129 201 153 226 179 245 210 259 242 273 275 282 313 285 348 283 380 276 407 263 422 239 399 209 369 223 338 231 304 235 274 232 242 222 211 209 184 190 153 171\n495 0.999 80 111 111 119 138 101 168 86 200 78 234 76 267 78 301 85 331 99 358 117 385 106 364 84 334 66 302 56 268 50 234 48 200 50 167 57 134 68 106 87\n495 0.999 63 146 82 169 120 169 157 169 195 169 232 170 269 170 307 169 344 169 382 169 418 169 399 144 361 144 323 143 287 143 250 144 212 144 174 144 137 143 99 144\n495 0.999 63 172 88 189 125 189 161 189 198 190 235 191 272 191 308 191 344 191 380 191 417 191 392 174 356 174 319 174 283 173 246 173 210 173 172 172 136 172 99 172\n495 0.999 90 256 113 273 140 289 171 300 203 307 236 309 268 308 300 302 329 292 356 280 374 261 348 245 322 260 293 270 262 275 231 276 200 273 170 264 141 251 113 236\n496 0.999 120 77 130 108 155 105 184 98 213 92 240 87 266 88 294 94 321 102 346 108 370 111 370 83 339 74 314 66 286 57 258 52 229 53 201 57 174 64 147 71\n496 0.999 147 150 154 167 176 167 197 168 218 168 239 169 259 169 280 170 301 171 321 171 343 170 336 153 315 153 294 152 273 152 253 151 231 151 209 150 189 149 168 149\n496 0.999 98 172 100 214 116 230 152 231 190 230 224 228 256 227 291 227 325 226 356 226 388 225 391 184 367 176 333 176 301 175 268 175 233 175 200 174 165 173 130 172\n497 0.999 37 87 66 136 102 123 146 109 188 103 238 99 286 101 329 106 371 119 413 133 442 118 428 80 393 65 353 53 309 45 264 41 215 41 167 46 123 56 83 70\n497 0.999 372 172 369 151 361 127 336 125 295 125 264 125 236 126 198 126 160 125 135 125 118 128 118 160 131 184 176 184 203 183 237 184 268 183 298 184 330 184 362 184\n497 0.999 31 202 55 245 96 244 146 245 188 245 241 245 289 245 332 245 376 244 421 244 461 241 439 200 402 196 359 196 314 196 266 196 219 196 170 196 123 196 76 197\n497 0.999 42 255 58 301 99 299 146 301 
188 301 238 301 285 301 327 300 369 299 415 299 455 295 439 256 400 255 361 255 316 255 271 254 226 254 176 253 130 254 85 254\n498 0.999 63 25 64 35 70 36 79 34 87 33 96 33 104 33 113 34 121 35 130 37 137 36 139 27 132 25 123 23 114 22 105 22 96 22 88 22 79 22 71 24\n498 0.999 48 76 51 87 61 84 73 82 84 80 96 79 107 79 119 80 131 81 142 84 152 85 151 75 140 71 129 69 117 67 105 67 94 67 82 68 70 70 59 73\n498 0.999 67 114 68 123 75 123 84 121 92 121 101 120 109 121 118 121 126 122 135 124 141 123 143 115 135 112 127 111 118 110 109 109 100 109 92 109 84 110 75 111\n498 0.999 50 129 54 139 65 136 76 134 88 132 100 132 112 132 124 133 137 135 148 139 159 141 157 130 146 126 134 124 121 122 109 121 97 121 85 122 73 124 62 126\n498 0.999 53 174 58 188 63 192 76 188 91 186 103 185 117 186 130 188 143 191 155 196 163 191 167 180 156 174 144 169 130 165 117 163 104 163 91 163 77 166 65 170\n498 0.999 45 212 44 220 45 227 46 234 47 241 50 248 53 254 56 260 60 266 64 270 72 270 72 264 68 257 65 251 62 244 60 237 58 230 56 222 56 215 52 212\n498 0.999 172 218 162 218 159 220 157 228 155 234 152 240 149 247 146 253 142 259 138 266 141 271 148 275 153 270 157 263 160 257 164 251 167 245 168 238 170 232 171 225\n498 0.999 87 281 87 287 87 290 90 292 95 293 99 293 103 293 108 293 112 294 116 294 121 293 121 289 121 284 118 281 114 281 109 281 104 281 100 281 96 280 92 280\n499 0.999 208 35 186 33 163 33 142 36 121 41 102 49 84 59 68 69 53 83 39 98 44 110 65 118 79 104 95 92 115 81 134 74 155 70 175 68 199 68 205 57\n499 0.999 79 135 99 139 112 139 132 139 153 139 173 139 192 139 213 139 233 139 254 138 254 118 243 112 222 112 202 112 182 112 161 112 140 111 118 111 98 111 80 113\n499 0.999 100 150 100 168 106 178 124 178 142 178 158 178 174 178 191 178 207 178 222 178 237 176 238 160 231 151 214 151 198 150 182 150 164 150 145 149 130 149 115 149\n"
  },
  {
    "path": "dataset/ctw1500/Readme.md",
    "content": "# Evaluation_step (Make sure you have downloaded the data and put them in the correct path)\n\n(In ROOT directory)\n\n1. Follow the format in Evaluation_Results_Example to output your own detection results. (Each line: x1,y1,x2,y2,...,xn,yn. Note that number of n is not limited to 14)\n\n2. run \"python tools/ctw1500_evaluation/ctw1500_eval.py\"\n\nExample output: \"ap: 0.6143, recall: 0.8162, pred: 0.7497, FM: 0.7815\"\n"
  },
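As a quick sanity check for the result format in step 1, a short helper can split one line into point pairs (a sketch, not part of this repo; `parse_result_line` is a hypothetical name):

```python
# Minimal sketch: parse one CTW1500-style result line
# "x1,y1,x2,y2,...,xn,yn" into (x, y) point pairs; n is not fixed to 14.

def parse_result_line(line):
    """Split a comma-separated coordinate line into (x, y) point pairs."""
    coords = [int(v) for v in line.strip().split(',')]
    if len(coords) % 2 != 0:
        raise ValueError("expected an even number of coordinates")
    return list(zip(coords[0::2], coords[1::2]))
```

For example, `parse_result_line("10,20,30,40,50,60")` yields three points; the evaluation script only requires that the polygon is closed implicitly by the point order.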
  {
    "path": "dataset/ctw1500_text.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n__author__ = '古溪'\nimport os\nimport numpy as np\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\nfrom util.io import read_lines\nimport cv2\n\n\nclass Ctw1500Text(TextDataset):\n    def __init__(self, data_root, is_training=True, load_memory=False, transform=None, ignore_list=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.load_memory = load_memory\n\n        self.image_root = os.path.join(data_root, 'train' if is_training else 'test', \"text_image\")\n        self.annotation_root = os.path.join(data_root, 'train' if is_training else 'test', \"text_label_circum\")\n        self.image_list = os.listdir(self.image_root)\n        self.annotation_list = ['{}'.format(img_name.replace('.jpg', '')) for img_name in self.image_list]\n\n        if self.load_memory:\n            self.datas = list()\n            for item in range(len(self.image_list)):\n                self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_carve_txt(gt_path):\n        \"\"\"\n        .mat file parser\n        :param gt_path: (str), mat file path\n        :return: (list), TextInstance\n        \"\"\"\n        lines = read_lines(gt_path + \".txt\")\n        polygons = []\n        for line in lines:\n            # line = strs.remove_all(line.strip('\\ufeff'), '\\xef\\xbb\\xbf')\n            gt = list(map(int, line.split(',')))\n            pts = np.stack([gt[4::2], gt[5::2]]).T.astype(np.int32)\n\n            pts[:, 0] = pts[:, 0] + gt[0]\n            pts[:, 1] = pts[:, 1] + gt[1]\n            polygons.append(TextInstance(pts, 'c', \"**\"))\n\n        return polygons\n\n    def load_img_gt(self, item):\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n\n        # Read image data\n        image = 
pil_load_img(image_path)\n        try:\n            h, w, c = image.shape\n            assert (c == 3)\n        except Exception:  # fall back to OpenCV when the image lacks 3 channels\n            image = cv2.imread(image_path)\n            image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n            image = np.array(image)\n\n        # Read annotation\n        annotation_id = self.annotation_list[item]\n        annotation_path = os.path.join(self.annotation_root, annotation_id)\n        polygons = self.parse_carve_txt(annotation_path)\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = image_id.split(\"/\")[-1]\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    from util.augmentation import Augmentation\n    from util.misc import regularize_sin_cos\n    from util.pbox import bbox_transfor_inv, minConnectPath\n    from util import canvas as cav\n    import time\n    import lanms  # required by merge_quadrangle_n9 below\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=640, mean=means, std=stds\n    )\n\n    trainset = Ctw1500Text(\n        data_root='../data/ctw1500',\n        is_training=True,\n        transform=transform\n    )\n\n    # img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, meta = trainset[944]\n    for idx in range(0, len(trainset)):\n        
t0 = time.time()\n        img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, gt_roi = trainset[idx]\n        img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, gt_roi \\\n            = map(lambda x: x.cpu().numpy(), (img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, gt_roi))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n        print(idx, img.shape)\n        top_map = radius_map[:, :, 0]\n        bot_map = radius_map[:, :, 1]\n\n        print(radius_map.shape)\n\n        sin_map, cos_map = regularize_sin_cos(sin_map, cos_map)\n        ret, labels = cv2.connectedComponents(tcl_mask[:, :, 0].astype(np.uint8), connectivity=8)\n        cv2.imshow(\"labels0\", cav.heatmap(np.array(labels * 255 / np.max(labels), dtype=np.uint8)))\n        print(np.sum(tcl_mask[:, :, 1]))\n\n        t0 = time.time()\n        for bbox_idx in range(1, ret):\n            bbox_mask = labels == bbox_idx\n            text_map = tcl_mask[:, :, 0] * bbox_mask\n\n            boxes = bbox_transfor_inv(radius_map, sin_map, cos_map, text_map, wclip=(2, 8))\n            # nms\n            boxes = lanms.merge_quadrangle_n9(boxes.astype('float32'), 0.25)\n            boxes = boxes[:, :8].reshape((-1, 4, 2)).astype(np.int32)\n            if boxes.shape[0] > 1:\n                center = np.mean(boxes, axis=1).astype(np.int32).tolist()\n                paths, routes_path = minConnectPath(center)\n                boxes = boxes[routes_path]\n                top = np.mean(boxes[:, 0:2, :], axis=1).astype(np.int32).tolist()\n                bot = np.mean(boxes[:, 2:4, :], axis=1).astype(np.int32).tolist()\n\n                boundary_point = top + bot[::-1]\n                # for index in routes:\n\n                for ip, pp in enumerate(top):\n                    if ip == 0:\n                        color = (0, 255, 255)\n                    elif ip == len(top) - 1:\n                        color = (255, 255, 
0)\n                    else:\n                        color = (0, 0, 255)\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, color, -1)\n                for ip, pp in enumerate(bot):\n                    if ip == 0:\n                        color = (0, 255, 255)\n                    elif ip == len(top) - 1:\n                        color = (255, 255, 0)\n                    else:\n                        color = (0, 255, 0)\n                    cv2.circle(img, (int(pp[0]), int(pp[1])), 2, color, -1)\n                cv2.drawContours(img, [np.array(boundary_point)], -1, (0, 255, 255), 1)\n        # print(\"nms time: {}\".format(time.time() - t0))\n        # # cv2.imshow(\"\", img)\n        # # cv2.waitKey(0)\n\n        # print(meta[\"image_id\"])\n        cv2.imshow('imgs', img)\n        cv2.imshow(\"\", cav.heatmap(np.array(labels * 255 / np.max(labels), dtype=np.uint8)))\n        cv2.imshow(\"tr_mask\", cav.heatmap(np.array(tr_mask * 255 / np.max(tr_mask), dtype=np.uint8)))\n        cv2.imshow(\"tcl_mask\",\n                   cav.heatmap(np.array(tcl_mask[:, :, 1] * 255 / np.max(tcl_mask[:, :, 1]), dtype=np.uint8)))\n        # cv2.imshow(\"top_map\", cav.heatmap(np.array(top_map * 255 / np.max(top_map), dtype=np.uint8)))\n        # cv2.imshow(\"bot_map\", cav.heatmap(np.array(bot_map * 255 / np.max(bot_map), dtype=np.uint8)))\n        cv2.waitKey(0)\n"
  },
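The offset arithmetic in `parse_carve_txt` above can be illustrated with a small pure-Python sketch, assuming each label line is `xmin,ymin,xmax,ymax,dx1,dy1,...` with the trailing offsets relative to the box origin (`carve_line_to_polygon` is a hypothetical name):

```python
# Sketch of parse_carve_txt's core step, assuming the CTW1500 line format
# xmin,ymin,xmax,ymax,dx1,dy1,...: trailing values are offsets from (xmin, ymin).
def carve_line_to_polygon(gt):
    xs = gt[4::2]  # relative x offsets
    ys = gt[5::2]  # relative y offsets
    # shift every offset by the box origin to get absolute image coordinates
    return [(x + gt[0], y + gt[1]) for x, y in zip(xs, ys)]
```

The repo version does the same with `np.stack([gt[4::2], gt[5::2]]).T` plus two in-place adds, producing an `(n, 2)` int32 array instead of tuples.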
  {
    "path": "dataset/data_util.py",
    "content": "from PIL import ImageFile\nfrom PIL import Image\nimport numpy as np\n\nImageFile.LOAD_TRUNCATED_IMAGES = True\nImage.MAX_IMAGE_PIXELS = None\n\n\ndef pil_load_img(path):\n    image = Image.open(path)\n    if image.mode != 'RGB':\n        image = image.convert('RGB')\n    image = np.array(image)\n    return image"
  },
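A brief note on why `pil_load_img` converts to RGB: downstream code assumes an HxWx3 array, so grayscale or RGBA sources must be normalized. The `load_rgb_array` variant below is a hypothetical sketch (it takes a file object instead of a path so it can be exercised in memory) showing the effect:

```python
import io
import numpy as np
from PIL import Image

# Variant of pil_load_img (a sketch, not the repo function): accepts any
# path-like or file object and always returns an HxWx3 RGB array.
def load_rgb_array(fp):
    image = Image.open(fp)
    if image.mode != 'RGB':
        image = image.convert('RGB')  # e.g. 'L' or 'RGBA' inputs become 3-channel
    return np.array(image)

# Grayscale 4x3 source image built in memory for illustration.
buf = io.BytesIO()
Image.new('L', (4, 3), color=128).save(buf, format='PNG')
buf.seek(0)
arr = load_rgb_array(buf)  # shape (3, 4, 3): height, width, channels
```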
  {
    "path": "dataset/dataload.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport copy\nimport cv2\nimport torch\nimport numpy as np\nfrom PIL import Image\nfrom scipy import ndimage as ndimg\nfrom cfglib.config import config as cfg\nfrom util.misc import find_bottom, find_long_edges, split_edge_seqence, \\\n    vector_sin, get_sample_point\n\n\ndef pil_load_img(path):\n    image = Image.open(path)\n    image = np.array(image)\n    return image\n\n\nclass TextInstance(object):\n    def __init__(self, points, orient, text):\n        self.orient = orient\n        self.text = text\n        self.bottoms = None\n        self.e1 = None\n        self.e2 = None\n        if self.text != \"#\":\n            self.label = 1\n        else:\n            self.label = -1\n\n        '''\n        remove_points = []\n        if len(points) > 4:\n            # remove point if area is almost unchanged after removing it\n            ori_area = cv2.contourArea(points)\n            for p in range(len(points)):\n                # attempt to remove p\n                index = list(range(len(points)))\n                index.remove(p)\n                area = cv2.contourArea(points[index])\n                if np.abs(ori_area - area)/ori_area < 0.0017 and len(points) - len(remove_points) > 4:\n                    remove_points.append(p)\n            self.points = np.array([point for i, point in enumerate(points) if i not in remove_points])\n        else:\n            self.points = np.array(points)\n        '''\n        self.points = np.array(points)\n\n    def find_bottom_and_sideline(self):\n        self.bottoms = find_bottom(self.points)  # find two bottoms of this Text\n        self.e1, self.e2 = find_long_edges(self.points, self.bottoms)  # find two long edge sequence\n\n    def get_sample_point(self, size=None):\n        mask = np.zeros(size, np.uint8)\n        cv2.fillPoly(mask, [self.points.astype(np.int32)], color=(1,))\n        control_points = get_sample_point(mask, cfg.num_points, 
cfg.approx_factor)\n\n        return control_points\n\n    def get_control_points(self, size=None):\n        n_disk = cfg.num_control_points // 2 - 1\n        sideline1 = split_edge_seqence(self.points, self.e1, n_disk)\n        sideline2 = split_edge_seqence(self.points, self.e2, n_disk)[::-1]\n        if sideline1[0][0] > sideline1[-1][0]:\n            sideline1 = sideline1[::-1]\n            sideline2 = sideline2[::-1]\n        p1 = np.mean(sideline1, axis=0)\n        p2 = np.mean(sideline2, axis=0)\n        vpp = vector_sin(p1 - p2)\n        if vpp >= 0:\n            top = sideline2\n            bot = sideline1\n        else:\n            top = sideline1\n            bot = sideline2\n\n        control_points = np.concatenate([np.array(top), np.array(bot[::-1])], axis=0).astype(np.float32)\n\n        return control_points\n\n    def __repr__(self):\n        return str(self.__dict__)\n\n    def __getitem__(self, item):\n        return getattr(self, item)\n\n\nclass TextDataset(object):\n\n    def __init__(self, transform, is_training=False):\n        super().__init__()\n        self.transform = transform\n        self.is_training = is_training\n        self.min_text_size = 4\n        self.jitter = 0.65\n        self.th_b = 0.35\n\n    @staticmethod\n    def sigmoid_alpha(x, k):\n        betak = (1 + np.exp(-k)) / (1 - np.exp(-k))\n        dm = max(np.max(x), 0.0001)\n        res = (2 / (1 + np.exp(-x * k / dm)) - 1) * betak\n        return np.maximum(0, res)\n\n    @staticmethod\n    def fill_polygon(mask, pts, value):\n        \"\"\"\n        fill polygon in the mask with value\n        :param mask: input mask\n        :param pts: polygon to draw\n        :param value: fill value\n        \"\"\"\n        # cv2.drawContours(mask, [polygon.astype(np.int32)], -1, value, -1)\n        cv2.fillPoly(mask, [pts.astype(np.int32)], color=(value,))\n        # rr, cc = drawpoly(polygon[:, 1], polygon[:, 0], shape=(mask.shape[0],mask.shape[1]))\n        # mask[rr, cc] = 
value\n\n    @staticmethod\n    def generate_proposal_point(text_mask, num_points, approx_factor, jitter=0.0, distance=10.0):\n        # get  proposal point in contours\n        h, w = text_mask.shape[0:2]\n        contours, _ = cv2.findContours(text_mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n        epsilon = approx_factor * cv2.arcLength(contours[0], True)\n        approx = cv2.approxPolyDP(contours[0], epsilon, True).reshape((-1, 2))\n        ctrl_points = split_edge_seqence(approx, num_points)\n\n        ctrl_points = np.array(ctrl_points[:num_points, :]).astype(np.int32)\n\n        if jitter > 0:\n            x_offset = (np.random.rand(ctrl_points.shape[0]) - 0.5) * distance*jitter\n            y_offset = (np.random.rand(ctrl_points.shape[0]) - 0.5) * distance*jitter\n            ctrl_points[:, 0] += x_offset.astype(np.int32)\n            ctrl_points[:, 1] += y_offset.astype(np.int32)\n        ctrl_points[:, 0] = np.clip(ctrl_points[:, 0], 1, w - 2)\n        ctrl_points[:, 1] = np.clip(ctrl_points[:, 1], 1, h - 2)\n        return ctrl_points\n\n    @staticmethod\n    def compute_direction_field(inst_mask, h, w):\n        _, labels = cv2.distanceTransformWithLabels(inst_mask, cv2.DIST_L2,\n                                                    cv2.DIST_MASK_PRECISE, labelType=cv2.DIST_LABEL_PIXEL)\n        # # compute the direction field\n        index = np.copy(labels)\n        index[inst_mask > 0] = 0\n        place = np.argwhere(index > 0)\n        nearCord = place[labels - 1, :]\n        x = nearCord[:, :, 0]\n        y = nearCord[:, :, 1]\n        nearPixel = np.zeros((2, h, w))\n        nearPixel[0, :, :] = x\n        nearPixel[1, :, :] = y\n        grid = np.indices(inst_mask.shape)\n        grid = grid.astype(float)\n        diff = nearPixel - grid\n\n        return diff\n\n    def make_text_region(self, img, polygons):\n        h, w = img.shape[0], img.shape[1]\n        mask_zeros = np.zeros(img.shape[:2], np.uint8)\n\n        
train_mask = np.ones((h, w), np.uint8)\n        tr_mask = np.zeros((h, w), np.uint8)\n        weight_matrix = np.zeros((h, w), dtype=np.float64)\n        direction_field = np.zeros((2, h, w), dtype=np.float64)\n        distance_field = np.zeros((h, w), np.float64)\n\n        gt_points = np.zeros((cfg.max_annotation, cfg.num_points, 2), dtype=np.float64)\n        proposal_points = np.zeros((cfg.max_annotation, cfg.num_points, 2), dtype=np.float64)\n        ignore_tags = np.zeros((cfg.max_annotation,), dtype=np.int64)\n\n        if polygons is None:\n            return train_mask, tr_mask, \\\n                   distance_field, direction_field, \\\n                   weight_matrix, gt_points, proposal_points, ignore_tags\n\n        for idx, polygon in enumerate(polygons):\n            if idx >= cfg.max_annotation:\n                break\n            polygon.points[:, 0] = np.clip(polygon.points[:, 0], 1, w - 2)\n            polygon.points[:, 1] = np.clip(polygon.points[:, 1], 1, h - 2)\n            gt_points[idx, :, :] = polygon.get_sample_point(size=(h, w))\n            cv2.fillPoly(tr_mask, [polygon.points.astype(np.int32)], color=(idx + 1,))\n\n            inst_mask = mask_zeros.copy()\n            cv2.fillPoly(inst_mask, [polygon.points.astype(np.int32)], color=(1,))\n            dmp = ndimg.distance_transform_edt(inst_mask)  # distance transform\n\n            if polygon.text == '#' or np.max(dmp) < self.min_text_size or np.sum(inst_mask) < 150:\n                cv2.fillPoly(train_mask, [polygon.points.astype(np.int32)], color=(0,))\n                ignore_tags[idx] = -1\n            else:\n                ignore_tags[idx] = 1\n\n            proposal_points[idx, :, :] = \\\n                self.generate_proposal_point(dmp / (np.max(dmp)+1e-3) >= self.th_b, cfg.num_points,\n                                             cfg.approx_factor, jitter=self.jitter, distance=self.th_b * np.max(dmp))\n\n            distance_field[:, :] = np.maximum(distance_field[:, :], dmp / 
(np.max(dmp)+1e-3))\n\n            weight_matrix[inst_mask > 0] = 1. / np.sqrt(inst_mask.sum())\n            diff = self.compute_direction_field(inst_mask, h, w)\n            direction_field[:, inst_mask > 0] = diff[:, inst_mask > 0]\n\n        # ### background ######\n        weight_matrix[tr_mask == 0] = 1. / np.sqrt(np.sum(tr_mask == 0))\n        # diff = self.compute_direction_field((tr_mask == 0).astype(np.uint8), h, w)\n        # direction_field[:, tr_mask == 0] = diff[:, tr_mask == 0]\n\n        train_mask = np.clip(train_mask, 0, 1)\n\n        return train_mask, tr_mask, \\\n               distance_field, direction_field, \\\n               weight_matrix, gt_points, proposal_points, ignore_tags\n\n    def get_training_data(self, image, polygons, image_id=None, image_path=None):\n        np.random.seed()\n        if self.transform:\n            #image, polygons = self.transform(image, polygons)\n            image, polygons = self.transform(copy.deepcopy(image), copy.deepcopy(polygons))\n\n        train_mask, tr_mask, \\\n        distance_field, direction_field, \\\n        weight_matrix, gt_points, proposal_points, ignore_tags = self.make_text_region(image, polygons)\n\n        # # to pytorch channel sequence\n        image = image.transpose(2, 0, 1)\n        image = torch.from_numpy(image).float()\n\n        train_mask = torch.from_numpy(train_mask).bool()\n        tr_mask = torch.from_numpy(tr_mask).int()\n        weight_matrix = torch.from_numpy(weight_matrix).float()\n        direction_field = torch.from_numpy(direction_field).float()\n        distance_field = torch.from_numpy(distance_field).float()\n        gt_points = torch.from_numpy(gt_points).float()\n        proposal_points = torch.from_numpy(proposal_points).float()\n        ignore_tags = torch.from_numpy(ignore_tags).int()\n\n        return image, train_mask, tr_mask, distance_field, \\\n               direction_field, weight_matrix, gt_points, proposal_points, ignore_tags\n\n    def 
get_test_data(self, image, polygons=None, image_id=None, image_path=None):\n        H, W, _ = image.shape\n        if self.transform:\n            image, polygons = self.transform(image, polygons)\n\n        # max point per polygon for annotation\n        points = np.zeros((cfg.max_annotation, 20, 2))\n        length = np.zeros(cfg.max_annotation, dtype=int)\n        label_tag = np.zeros(cfg.max_annotation, dtype=int)\n        if polygons is not None:\n            for i, polygon in enumerate(polygons):\n                pts = polygon.points\n                points[i, :pts.shape[0]] = polygon.points\n                length[i] = pts.shape[0]\n                if polygon.text != '#':\n                    label_tag[i] = 1\n                else:\n                    label_tag[i] = -1\n\n        meta = {\n            'image_id': image_id,\n            'image_path': image_path,\n            'annotation': points,\n            'n_annotation': length,\n            'label_tag': label_tag,\n            'Height': H,\n            'Width': W\n        }\n\n        # to pytorch channel sequence\n        image = image.transpose(2, 0, 1)\n\n        return image, meta\n\n    def __len__(self):\n        raise NotImplementedError()\n"
  },
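The `sigmoid_alpha` transform in `TextDataset` above can be sketched for a scalar input (the repo version is vectorized with NumPy and uses `dm = max(np.max(x), 0.0001)`). Plugging in `x = dm` makes the two factors cancel, so the field maximum maps to exactly 1:

```python
import math

# Scalar sketch of TextDataset.sigmoid_alpha: a rescaled sigmoid that maps the
# maximum of the distance field (x == dm) to exactly 1 and clips negatives to 0.
def sigmoid_alpha(x, k, dm):
    betak = (1 + math.exp(-k)) / (1 - math.exp(-k))
    res = (2 / (1 + math.exp(-x * k / dm)) - 1) * betak
    return max(0.0, res)
```

At `x = dm`, `(2/(1+e^-k) - 1) * betak = ((1-e^-k)/(1+e^-k)) * ((1+e^-k)/(1-e^-k)) = 1`, which is why the normalized distance field peaks at 1 regardless of `k`.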
  {
    "path": "dataset/deploy.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport scipy.io as io\nimport numpy as np\nimport os\n\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\n\n\nclass DeployDataset(TextDataset):\n\n    def __init__(self, image_root, transform=None):\n        super().__init__(transform)\n        self.image_root = image_root\n        self.image_list = os.listdir(image_root)\n\n    def __getitem__(self, item):\n\n        # Read image data\n        image_id = self.image_list[item]\n        image_path = os.path.join(self.image_root, image_id)\n        image = pil_load_img(image_path)\n\n        return self.get_test_data(image, image_id=image_id, image_path=image_path)\n\n    def __len__(self):\n        return len(self.image_list)\n\n\nif __name__ == '__main__':\n    import os\n    from util.augmentation import BaseTransform, Augmentation\n    from torch.utils.data import DataLoader\n\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = BaseTransform(\n        size=512, mean=means, std=stds\n    )\n\n    trainset = DeployDataset(\n        image_root='data/total-text/Images/Train',\n        transform=transform\n    )\n\n    loader = DataLoader(trainset, batch_size=1, num_workers=0)\n\n    for img, meta in loader:\n        print(img.size(), type(img))"
  },
  {
    "path": "dataset/icdar15/075eval.sh",
    "content": "rm submit/*\ncp $1/*.txt submit\ncd submit/;zip -r  submit.zip *;mv submit.zip ../; cd ../\npython2 script075.py -g=gt.zip -s=submit.zip\n"
  },
  {
    "path": "dataset/icdar15/Evaluation_Protocol/readme.txt",
    "content": "INSTRUCTIONS FOR THE STANDALONE SCRIPTS\nRequirements:\n-         Python version 2.7.\n-         Each Task requires different Python modules. When running the script, if some module is not installed you will see a notification and installation instructions.\n \nProcedure:\nDownload the ZIP file for the requested script and unzip it to a directory.\n \nOpen a terminal in the directory and run the command:\npython script.py –g=gt.zip –s=submit.zip\n \nIf you have already installed all the required modules, then you will see the method’s results or an error message if the submitted file is not correct.\n \nparameters:\n-g: Path of the Ground Truth file. In most cases, the Ground Truth will be included in the same Zip file named 'gt.zip', gt.txt' or 'gt.json'. If not, you will be able to get it on the Downloads page of the Task.\n-s: Path of your method's results file.\n \nOptional parameters:\n-o: Path to a directory where to copy the file ‘results.zip’ that contains per-sample results.\n-p: JSON string parameters to override the script default parameters. The parameters that can be overrided are inside the function 'default_evaluation_params' located at the begining of the evaluation Script.\n \nExample: python script.py –g=gt.zip –s=submit.zip –o=./ -p='{\" IOU_CONSTRAINT\" = 0.8}'"
  },
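The `-p` override described above amounts to merging a JSON object into the script's default evaluation parameters. A minimal sketch (hypothetical `apply_overrides` helper, not the script's actual code):

```python
import json

# Sketch of how a -p JSON string is typically applied over the defaults returned
# by default_evaluation_params (apply_overrides is a hypothetical helper name).
def apply_overrides(defaults, p_string):
    params = dict(defaults)          # keep the defaults intact
    params.update(json.loads(p_string))  # -p passes a JSON object on the CLI
    return params
```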
  {
    "path": "dataset/icdar15/Evaluation_Protocol/rrc_evaluation_funcs.py",
    "content": "#!/usr/bin/env python2\n#encoding: UTF-8\nimport json\nimport zipfile\nimport re\nimport sys\nimport os\nimport codecs\n# import importlib\n# # from StringIO import StringIO\n\ndef print_help():\n    sys.stdout.write('Usage: python %s.py -g=<gtFile> -s=<submFile> [-o=<outputFolder> -p=<jsonParams>]' %sys.argv[0])\n    sys.exit(2)\n    \n\ndef load_zip_file_keys(file,fileNameRegExp=''):\n    \"\"\"\n    Returns an array with the entries of the ZIP file that match with the regular expression.\n    The key's are the names or the file or the capturing group definied in the fileNameRegExp\n    \"\"\"\n    try:\n        archive=zipfile.ZipFile(file, mode='r', allowZip64=True)\n    except :\n        raise Exception('Error loading the ZIP archive.')\n\n    pairs = []\n    \n    for name in archive.namelist():\n        addFile = True\n        keyName = name\n        if fileNameRegExp!=\"\":\n            m = re.match(fileNameRegExp,name)\n            if m == None:\n                addFile = False\n            else:\n                if len(m.groups())>0:\n                    keyName = m.group(1)\n                    \n        if addFile:\n            pairs.append( keyName )\n                \n    return pairs\n    \n\ndef load_zip_file(file,fileNameRegExp='',allEntries=False):\n    \"\"\"\n    Returns an array with the contents (filtered by fileNameRegExp) of a ZIP file.\n    The key's are the names or the file or the capturing group definied in the fileNameRegExp\n    allEntries validates that all entries in the ZIP file pass the fileNameRegExp\n    \"\"\"\n    try:\n        archive=zipfile.ZipFile(file, mode='r', allowZip64=True)\n    except :\n        raise Exception('Error loading the ZIP archive')    \n\n    pairs = []\n    for name in archive.namelist():\n        addFile = True\n        keyName = name\n        if fileNameRegExp!=\"\":\n            m = re.match(fileNameRegExp,name)\n            if m == None:\n                addFile = False\n            
else:\n                if len(m.groups())>0:\n                    keyName = m.group(1)\n        \n        if addFile:\n            pairs.append( [ keyName , archive.read(name)] )\n        else:\n            if allEntries:\n                raise Exception('ZIP entry not valid: %s' %name)             \n\n    return dict(pairs)\n\t\ndef decode_utf8(raw):\n    \"\"\"\n    Returns a Unicode object on success, or None on failure\n    \"\"\"\n    try:\n        raw = codecs.decode(raw,'utf-8', 'replace')\n        #extracts BOM if exists\n        raw = raw.encode('utf8')\n        if raw.startswith(codecs.BOM_UTF8):\n            raw = raw.replace(codecs.BOM_UTF8, '', 1)\n        return raw.decode('utf-8')\n    except:\n       return None\n   \ndef validate_lines_in_file(fileName,file_contents,CRLF=True,LTRB=True,withTranscription=False,withConfidence=False,imWidth=0,imHeight=0):\n    \"\"\"\n    This function validates that all lines of the file calling the Line validation function for each line\n    \"\"\"\n    utf8File = decode_utf8(file_contents)\n    if (utf8File is None) :\n        raise Exception(\"The file %s is not UTF-8\" %fileName)\n\n    lines = utf8File.split( \"\\r\\n\" if CRLF else \"\\n\" )\n    for line in lines:\n        line = line.replace(\"\\r\",\"\").replace(\"\\n\",\"\")\n        if(line != \"\"):\n            try:\n                validate_tl_line(line,LTRB,withTranscription,withConfidence,imWidth,imHeight)\n            except Exception as e:\n                raise Exception((\"Line in sample not valid. Sample: %s Line: %s Error: %s\" %(fileName,line,str(e))).encode('utf-8', 'replace'))\n    \n   \n   \ndef validate_tl_line(line,LTRB=True,withTranscription=True,withConfidence=True,imWidth=0,imHeight=0):\n    \"\"\"\n    Validate the format of the line. 
If the line is not valid an exception will be raised.\n    If maxWidth and maxHeight are specified, all points must be inside the imgage bounds.\n    Posible values are:\n    LTRB=True: xmin,ymin,xmax,ymax[,confidence][,transcription] \n    LTRB=False: x1,y1,x2,y2,x3,y3,x4,y4[,confidence][,transcription] \n    \"\"\"\n    get_tl_line_values(line,LTRB,withTranscription,withConfidence,imWidth,imHeight)\n    \n   \ndef get_tl_line_values(line,LTRB=True,withTranscription=False,withConfidence=False,imWidth=0,imHeight=0):\n    \"\"\"\n    Validate the format of the line. If the line is not valid an exception will be raised.\n    If maxWidth and maxHeight are specified, all points must be inside the imgage bounds.\n    Posible values are:\n    LTRB=True: xmin,ymin,xmax,ymax[,confidence][,transcription] \n    LTRB=False: x1,y1,x2,y2,x3,y3,x4,y4[,confidence][,transcription] \n    Returns values from a textline. Points , [Confidences], [Transcriptions]\n    \"\"\"\n    confidence = 0.0\n    transcription = \"\";\n    points = []\n    \n    numPoints = 4;\n    \n    if LTRB:\n    \n        numPoints = 4;\n        \n        if withTranscription and withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*,(.*)$',line)\n            if m == None :\n                m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*,(.*)$',line)\n                raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,confidence,transcription\")\n        elif withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. 
Should be: xmin,ymin,xmax,ymax,confidence\")\n        elif withTranscription:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,(.*)$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax,transcription\")\n        else:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-9]+)\\s*,\\s*([0-9]+)\\s*,?\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: xmin,ymin,xmax,ymax\")\n            \n        xmin = int(m.group(1))\n        ymin = int(m.group(2))\n        xmax = int(m.group(3))\n        ymax = int(m.group(4))\n        if(xmax<xmin):\n                raise Exception(\"Xmax value (%s) not valid (Xmax < Xmin).\" %(xmax))\n        if(ymax<ymin):\n                raise Exception(\"Ymax value (%s)  not valid (Ymax < Ymin).\" %(ymax))  \n\n        points = [ float(m.group(i)) for i in range(1, (numPoints+1) ) ]\n        \n        if (imWidth>0 and imHeight>0):\n            validate_point_inside_bounds(xmin,ymin,imWidth,imHeight);\n            validate_point_inside_bounds(xmax,ymax,imWidth,imHeight);\n\n    else:\n        \n        numPoints = 8;\n        \n        if withTranscription and withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*,(.*)$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: x1,y1,x2,y2,x3,y3,x4,y4,confidence,transcription\")\n        elif withConfidence:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*([0-1].?[0-9]*)\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. 
Should be: x1,y1,x2,y2,x3,y3,x4,y4,confidence\")\n        elif withTranscription:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,(.*)$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: x1,y1,x2,y2,x3,y3,x4,y4,transcription\")\n        else:\n            m = re.match(r'^\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*,\\s*(-?[0-9]+)\\s*$',line)\n            if m == None :\n                raise Exception(\"Format incorrect. Should be: x1,y1,x2,y2,x3,y3,x4,y4\")\n            \n        points = [ float(m.group(i)) for i in range(1, (numPoints+1) ) ]\n        \n        validate_clockwise_points(points)\n        \n        if (imWidth>0 and imHeight>0):\n            validate_point_inside_bounds(points[0],points[1],imWidth,imHeight);\n            validate_point_inside_bounds(points[2],points[3],imWidth,imHeight);\n            validate_point_inside_bounds(points[4],points[5],imWidth,imHeight);\n            validate_point_inside_bounds(points[6],points[7],imWidth,imHeight);\n            \n    \n    if withConfidence:\n        try:\n            confidence = float(m.group(numPoints+1))\n        except ValueError:\n            raise Exception(\"Confidence value must be a float\")       \n            \n    if withTranscription:\n        posTranscription = numPoints + (2 if withConfidence else 1)\n        transcription = m.group(posTranscription)\n        m2 = re.match(r'^\\s*\\\"(.*)\\\"\\s*$',transcription)\n        if m2 != None : #Transcription with double quotes, we extract the value and replace escaped characters\n            transcription = m2.group(1).replace(\"\\\\\\\\\", \"\\\\\").replace(\"\\\\\\\"\", \"\\\"\")\n    \n    return points,confidence,transcription\n    \n            \ndef 
validate_point_inside_bounds(x,y,imWidth,imHeight):\n    if(x<0 or x>imWidth):\n            raise Exception(\"X value (%s) not valid. Image dimensions: (%s,%s)\" %(x,imWidth,imHeight))\n    if(y<0 or y>imHeight):\n            raise Exception(\"Y value (%s) not valid. Image dimensions: (%s,%s)\" %(y,imWidth,imHeight))\n\ndef validate_clockwise_points(points):\n    \"\"\"\n    Validates that the 4 points that delimit a polygon are in clockwise order.\n    \"\"\"\n    \n    if len(points) != 8:\n        raise Exception(\"Points list not valid.\" + str(len(points)))\n    \n    point = [\n                [int(points[0]) , int(points[1])],\n                [int(points[2]) , int(points[3])],\n                [int(points[4]) , int(points[5])],\n                [int(points[6]) , int(points[7])]\n            ]\n    edge = [\n                ( point[1][0] - point[0][0])*( point[1][1] + point[0][1]),\n                ( point[2][0] - point[1][0])*( point[2][1] + point[1][1]),\n                ( point[3][0] - point[2][0])*( point[3][1] + point[2][1]),\n                ( point[0][0] - point[3][0])*( point[0][1] + point[3][1])\n    ]\n    \n    summatory = edge[0] + edge[1] + edge[2] + edge[3];\n    if summatory>0:\n        raise Exception(\"Points are not clockwise. The coordinates of bounding quadrilaterals have to be given in clockwise order. Regarding the correct interpretation of 'clockwise' remember that the image coordinate system used is the standard one, with the image origin at the upper left, the X axis extending to the right and Y axis extending downwards.\")\n\ndef get_tl_line_values_from_file_contents(content,CRLF=True,LTRB=True,withTranscription=False,withConfidence=False,imWidth=0,imHeight=0,sort_by_confidences=True):\n    \"\"\"\n    Returns all points, confidences and transcriptions of a file in lists. 
Valid line formats:\n    xmin,ymin,xmax,ymax,[confidence],[transcription]\n    x1,y1,x2,y2,x3,y3,x4,y4,[confidence],[transcription]\n    \"\"\"\n    pointsList = []\n    transcriptionsList = []\n    confidencesList = []\n    \n    lines = content.split( \"\\r\\n\" if CRLF else \"\\n\" )\n    for line in lines:\n        line = line.replace(\"\\r\",\"\").replace(\"\\n\",\"\")\n        if(line != \"\") :\n            points, confidence, transcription = get_tl_line_values(line,LTRB,withTranscription,withConfidence,imWidth,imHeight);\n            pointsList.append(points)\n            transcriptionsList.append(transcription)\n            confidencesList.append(confidence)\n\n    if withConfidence and len(confidencesList)>0 and sort_by_confidences:\n        import numpy as np\n        sorted_ind = np.argsort(-np.array(confidencesList))\n        confidencesList = [confidencesList[i] for i in sorted_ind]\n        pointsList = [pointsList[i] for i in sorted_ind]\n        transcriptionsList = [transcriptionsList[i] for i in sorted_ind]        \n        \n    return pointsList,confidencesList,transcriptionsList\n\ndef main_evaluation(p,default_evaluation_params_fn,validate_data_fn,evaluate_method_fn,show_result=True,per_sample=True):\n    \"\"\"\n    This process validates a method, evaluates it and, if it succeeds, generates a ZIP file with a JSON entry for each sample.\n    Params:\n    p: Dictionary of parameters with the GT/submission locations. 
If None is passed, the parameters sent by the system are used.\n    default_evaluation_params_fn: points to a function that returns a dictionary with the default parameters used for the evaluation\n    validate_data_fn: points to a method that validates the correct format of the submission\n    evaluate_method_fn: points to a function that evaluates the submission and returns a Dictionary with the results\n    \"\"\"\n\n    if (p == None):\n        p = dict([s[1:].split('=') for s in sys.argv[1:]])\n        if(len(sys.argv)<3):\n            print_help()\n\n    evalParams = default_evaluation_params_fn()\n    if 'p' in p.keys():\n        evalParams.update( p['p'] if isinstance(p['p'], dict) else json.loads(p['p'][1:-1]) )\n\n    resDict={'calculated':True,'Message':'','method':'{}','per_sample':'{}'}    \n    try:\n        validate_data_fn(p['g'], p['s'], evalParams)  \n        evalData = evaluate_method_fn(p['g'], p['s'], evalParams)\n        resDict.update(evalData)\n        \n    except Exception as e:\n        resDict['Message']= str(e)\n        resDict['calculated']=False\n\n    if 'o' in p:\n        if not os.path.exists(p['o']):\n            os.makedirs(p['o'])\n\n        resultsOutputname = p['o'] + '/results.zip'\n        outZip = zipfile.ZipFile(resultsOutputname, mode='w', allowZip64=True)\n\n        del resDict['per_sample']\n        if 'output_items' in resDict.keys():\n            del resDict['output_items']\n\n        outZip.writestr('method.json',json.dumps(resDict))\n        \n    if not resDict['calculated']:\n        if show_result:\n            sys.stderr.write('Error!\\n'+ resDict['Message']+'\\n\\n')\n        if 'o' in p:\n            outZip.close()\n        return resDict\n    \n    if 'o' in p:\n        if per_sample == True:\n            for k,v in evalData['per_sample'].items():\n                outZip.writestr( k + '.json',json.dumps(v)) \n\n            if 'output_items' in evalData.keys():\n                for k, v in 
evalData['output_items'].items():\n                    outZip.writestr( k,v) \n\n        outZip.close()\n\n    if show_result:\n        sys.stdout.write(\"Calculated:\\n\")\n        sys.stdout.write(json.dumps(resDict['method']))\n        print(\"\\n\")\n    \n    return resDict\n\n\ndef main_validation(default_evaluation_params_fn,validate_data_fn):\n    \"\"\"\n    This process validates a method\n    Params:\n    default_evaluation_params_fn: points to a function that returns a dictionary with the default parameters used for the evaluation\n    validate_data_fn: points to a method that validates the correct format of the submission\n    \"\"\"    \n    try:\n        p = dict([s[1:].split('=') for s in sys.argv[1:]])\n        evalParams = default_evaluation_params_fn()\n        if 'p' in p.keys():\n            evalParams.update( p['p'] if isinstance(p['p'], dict) else json.loads(p['p'][1:-1]) )\n\n        validate_data_fn(p['g'], p['s'], evalParams)              \n        print('SUCCESS')\n        sys.exit(0)\n    except Exception as e:\n        print(str(e))\n        sys.exit(101)\n"
  },
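The `validate_clockwise_points` helper in `rrc_evaluation_funcs` above rejects quadrilaterals whose signed edge sum is positive, i.e. points listed counter-clockwise on screen. A minimal standalone sketch of the same edge-sum (shoelace-style) test; the helper name `is_clockwise` is my own, not part of the script:

```python
def is_clockwise(points):
    """Return True if a quadrilateral given as a flat list
    [x1, y1, x2, y2, x3, y3, x4, y4] runs clockwise in image
    coordinates (origin top-left, Y axis pointing down).

    Mirrors the evaluation script's check: it sums
    (x2 - x1) * (y2 + y1) over all four edges and treats a
    positive sum as counter-clockwise (invalid).
    """
    assert len(points) == 8, "expected 4 points as 8 values"
    pts = [(points[i], points[i + 1]) for i in range(0, 8, 2)]
    edge_sum = sum(
        (x2 - x1) * (y2 + y1)
        for (x1, y1), (x2, y2) in zip(pts, pts[1:] + pts[:1])
    )
    return edge_sum <= 0
```

Because the Y axis points down in image coordinates, a box that looks clockwise on screen (top-left, top-right, bottom-right, bottom-left) yields a non-positive edge sum, which is exactly the ordering the script accepts.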
  {
    "path": "dataset/icdar15/Evaluation_Protocol/script.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom collections import namedtuple\nimport rrc_evaluation_funcs\nimport importlib\nimport numpy as np\nimport Polygon as plg\nfrom tqdm import tqdm\n\neval_result_dir = \"../../output/Analysis/output_eval\"\nfid_path = '{}/Eval_icdar15.txt'.format(eval_result_dir)\n\ndef evaluation_imports():\n    \"\"\"\n    evaluation_imports: Dictionary ( key = module name , value = alias  )  with python modules used in the evaluation.\n    \"\"\"\n    return {\n            'Polygon':'plg',\n            'numpy':'np'\n            }\n\ndef default_evaluation_params():\n    \"\"\"\n    default_evaluation_params: Default parameters to use for the validation and evaluation.\n    \"\"\"\n    return {\n                'IOU_CONSTRAINT' :0.5,\n                'AREA_PRECISION_CONSTRAINT' :0.5,\n                'GT_SAMPLE_NAME_2_ID':'gt_img_([0-9]+).txt',\n                'DET_SAMPLE_NAME_2_ID':'res_img_([0-9]+).txt',\n                'LTRB':False, #LTRB:2points(left,top,right,bottom) or 4 points(x1,y1,x2,y2,x3,y3,x4,y4)\n                'CRLF':False, # Lines are delimited by Windows CRLF format\n                'CONFIDENCES':False, #Detections must include confidence value. 
AP will be calculated\n                'PER_SAMPLE_RESULTS':True #Generate per sample results and produce data for visualization\n            }\n\ndef validate_data(gtFilePath, submFilePath,evaluationParams):\n    \"\"\"\n    Method validate_data: validates that all files in the results folder are correct (have the correct name and contents).\n                            Validates also that there are no missing files in the folder.\n                            If an error is detected, the method raises it\n    \"\"\"\n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n\n    #Validate format of GroundTruth\n    for k in gt:\n        rrc_evaluation_funcs.validate_lines_in_file(k,gt[k],evaluationParams['CRLF'],evaluationParams['LTRB'],True)\n\n    #Validate format of results\n    for k in subm:\n        if k not in gt:\n            raise Exception(\"The sample %s is not present in GT\" %k)\n\n        rrc_evaluation_funcs.validate_lines_in_file(k,subm[k],evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n\n\ndef evaluate_method(gtFilePath, submFilePath, evaluationParams):\n    \"\"\"\n    Method evaluate_method: evaluates the method and returns the results\n        Results. Dictionary with the following values:\n        - method (required)  Global method metrics. Ex: { 'Precision':0.8,'Recall':0.9 }\n        - samples (optional) Per sample metrics. 
Ex: {'sample1' : { 'Precision':0.8,'Recall':0.9 } , 'sample2' : { 'Precision':0.8,'Recall':0.9 }\n    \"\"\"\n    for module,alias in evaluation_imports().items():\n        globals()[alias] = importlib.import_module(module)\n\n    def polygon_from_points(points):\n        \"\"\"\n        Returns a Polygon object to use with the Polygon2 class from a list of 8 points: x1,y1,x2,y2,x3,y3,x4,y4\n        \"\"\"\n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(points[0])\n        resBoxes[0,4]=int(points[1])\n        resBoxes[0,1]=int(points[2])\n        resBoxes[0,5]=int(points[3])\n        resBoxes[0,2]=int(points[4])\n        resBoxes[0,6]=int(points[5])\n        resBoxes[0,3]=int(points[6])\n        resBoxes[0,7]=int(points[7])\n        pointMat = resBoxes[0].reshape([2,4]).T\n        return plg.Polygon(pointMat)\n\n    def rectangle_to_polygon(rect):\n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(rect.xmin)\n        resBoxes[0,4]=int(rect.ymax)\n        resBoxes[0,1]=int(rect.xmin)\n        resBoxes[0,5]=int(rect.ymin)\n        resBoxes[0,2]=int(rect.xmax)\n        resBoxes[0,6]=int(rect.ymin)\n        resBoxes[0,3]=int(rect.xmax)\n        resBoxes[0,7]=int(rect.ymax)\n\n        pointMat = resBoxes[0].reshape([2,4]).T\n\n        return plg.Polygon(pointMat)\n\n    def rectangle_to_points(rect):\n        points = [int(rect.xmin), int(rect.ymax), int(rect.xmax), int(rect.ymax), int(rect.xmax), int(rect.ymin), int(rect.xmin), int(rect.ymin)]\n        return points\n\n    def get_union(pD,pG):\n        areaA = pD.area();\n        areaB = pG.area();\n        return areaA + areaB - get_intersection(pD, pG);\n\n    def get_intersection_over_union(pD,pG):\n        try:\n            return get_intersection(pD, pG) / get_union(pD, pG);\n        except:\n            return 0\n\n    def get_intersection(pD,pG):\n        pInt = pD & pG\n        if len(pInt) == 0:\n            return 0\n        return pInt.area()\n\n    def 
compute_ap(confList, matchList,numGtCare):\n        correct = 0\n        AP = 0\n        if len(confList)>0:\n            confList = np.array(confList)\n            matchList = np.array(matchList)\n            sorted_ind = np.argsort(-confList)\n            confList = confList[sorted_ind]\n            matchList = matchList[sorted_ind]\n            for n in range(len(confList)):\n                match = matchList[n]\n                if match:\n                    correct += 1\n                    AP += float(correct)/(n + 1)\n\n            if numGtCare>0:\n                AP /= numGtCare\n\n        return AP\n\n    perSampleMetrics = {}\n\n    matchedSum = 0\n\n    Rectangle = namedtuple('Rectangle', 'xmin ymin xmax ymax')\n\n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n\n\n    numGlobalCareGt = 0;\n    numGlobalCareDet = 0;\n\n    arrGlobalConfidences = [];\n    arrGlobalMatches = [];\n\n    with open(fid_path, 'w') as f:\n        for resFile in tqdm(gt):\n\n            gtFile = rrc_evaluation_funcs.decode_utf8(gt[resFile])\n            recall = 0\n            precision = 0\n            hmean = 0\n\n            detMatched = 0\n\n            iouMat = np.empty([1,1])\n\n            gtPols = []\n            detPols = []\n\n            gtPolPoints = []\n            detPolPoints = []\n\n            #Array of Ground Truth Polygons' keys marked as don't Care\n            gtDontCarePolsNum = []\n            #Array of Detected Polygons' matched with a don't Care GT\n            detDontCarePolsNum = []\n\n            pairs = []\n            detMatchedNums = []\n\n            arrSampleConfidences = [];\n            arrSampleMatch = [];\n            sampleAP = 0;\n\n            evaluationLog = \"\"\n\n            pointsList,_,transcriptionsList = 
rrc_evaluation_funcs.get_tl_line_values_from_file_contents(gtFile,evaluationParams['CRLF'],evaluationParams['LTRB'],True,False)\n            for n in range(len(pointsList)):\n                points = pointsList[n]\n                transcription = transcriptionsList[n]\n                dontCare = transcription == \"###\"\n                if evaluationParams['LTRB']:\n                    gtRect = Rectangle(*points)\n                    gtPol = rectangle_to_polygon(gtRect)\n                else:\n                    gtPol = polygon_from_points(points)\n\n                gtPols.append(gtPol)\n                gtPolPoints.append(points)\n                if dontCare:\n                    gtDontCarePolsNum.append(len(gtPols)-1)\n\n            evaluationLog += \"GT polygons: \" + str(len(gtPols)) + (\" (\" + str(len(gtDontCarePolsNum)) + \" don't care)\\n\" if len(gtDontCarePolsNum)>0 else \"\\n\")\n\n            if resFile in subm:\n\n                detFile = rrc_evaluation_funcs.decode_utf8(subm[resFile])\n\n                pointsList,confidencesList,_ = rrc_evaluation_funcs.get_tl_line_values_from_file_contents(detFile,evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n                for n in range(len(pointsList)):\n                    points = pointsList[n]\n\n                    if evaluationParams['LTRB']:\n                        detRect = Rectangle(*points)\n                        detPol = rectangle_to_polygon(detRect)\n                    else:\n                        detPol = polygon_from_points(points)\n                    detPols.append(detPol)\n                    detPolPoints.append(points)\n                    if len(gtDontCarePolsNum)>0 :\n                        for dontCarePol in gtDontCarePolsNum:\n                            dontCarePol = gtPols[dontCarePol]\n                            intersected_area = get_intersection(dontCarePol,detPol)\n                            pdDimensions = detPol.area()\n            
                precision = 0 if pdDimensions == 0 else intersected_area / pdDimensions\n                            if (precision > evaluationParams['AREA_PRECISION_CONSTRAINT'] ):\n                                detDontCarePolsNum.append( len(detPols)-1 )\n                                break\n\n                evaluationLog += \"DET polygons: \" + str(len(detPols)) + (\" (\" + str(len(detDontCarePolsNum)) + \" don't care)\\n\" if len(detDontCarePolsNum)>0 else \"\\n\")\n\n                if len(gtPols)>0 and len(detPols)>0:\n                    #Calculate IoU and precision matrixs\n                    outputShape=[len(gtPols),len(detPols)]\n                    iouMat = np.empty(outputShape)\n                    gtRectMat = np.zeros(len(gtPols),np.int8)\n                    detRectMat = np.zeros(len(detPols),np.int8)\n                    for gtNum in range(len(gtPols)):\n                        for detNum in range(len(detPols)):\n                            pG = gtPols[gtNum]\n                            pD = detPols[detNum]\n                            iouMat[gtNum,detNum] = get_intersection_over_union(pD,pG)\n\n                    for gtNum in range(len(gtPols)):\n                        for detNum in range(len(detPols)):\n                            if gtRectMat[gtNum] == 0 and detRectMat[detNum] == 0 and gtNum not in gtDontCarePolsNum and detNum not in detDontCarePolsNum :\n                                if iouMat[gtNum,detNum]>evaluationParams['IOU_CONSTRAINT']:\n                                    gtRectMat[gtNum] = 1\n                                    detRectMat[detNum] = 1\n                                    detMatched += 1\n                                    pairs.append({'gt':gtNum,'det':detNum})\n                                    detMatchedNums.append(detNum)\n                                    evaluationLog += \"Match GT #\" + str(gtNum) + \" with Det #\" + str(detNum) + \"\\n\"\n\n                if evaluationParams['CONFIDENCES']:\n        
            for detNum in range(len(detPols)):\n                        if detNum not in detDontCarePolsNum :\n                            #we exclude the don't care detections\n                            match = detNum in detMatchedNums\n\n                            arrSampleConfidences.append(confidencesList[detNum])\n                            arrSampleMatch.append(match)\n\n                            arrGlobalConfidences.append(confidencesList[detNum]);\n                            arrGlobalMatches.append(match);\n\n            numGtCare = (len(gtPols) - len(gtDontCarePolsNum))\n            numDetCare = (len(detPols) - len(detDontCarePolsNum))\n            if numGtCare == 0:\n                recall = float(1)\n                precision = float(0) if numDetCare >0 else float(1)\n                sampleAP = precision\n            else:\n                recall = float(detMatched) / numGtCare\n                precision = 0 if numDetCare==0 else float(detMatched) / numDetCare\n                if evaluationParams['CONFIDENCES'] and evaluationParams['PER_SAMPLE_RESULTS']:\n                    sampleAP = compute_ap(arrSampleConfidences, arrSampleMatch, numGtCare )\n\n            hmean = 0 if (precision + recall)==0 else 2.0 * precision * recall / (precision + recall)\n\n            # write result\n            f.write('img_%s :: Precision=%.4f - Recall=%.4f\\n' % (str(resFile)+\".txt\", precision, recall))\n\n            matchedSum += detMatched\n            numGlobalCareGt += numGtCare\n            numGlobalCareDet += numDetCare\n\n            if evaluationParams['PER_SAMPLE_RESULTS']:\n                perSampleMetrics[resFile] = {\n                                                'precision':precision,\n                                                'recall':recall,\n                                                'hmean':hmean,\n                                                'pairs':pairs,\n                                                'AP':sampleAP,\n          
                                      'iouMat':[] if len(detPols)>100 else iouMat.tolist(),\n                                                'gtPolPoints':gtPolPoints,\n                                                'detPolPoints':detPolPoints,\n                                                'gtDontCare':gtDontCarePolsNum,\n                                                'detDontCare':detDontCarePolsNum,\n                                                'evaluationParams': evaluationParams,\n                                                'evaluationLog': evaluationLog\n                                            }\n\n        # Compute MAP and MAR\n        AP = 0\n        if evaluationParams['CONFIDENCES']:\n            AP = compute_ap(arrGlobalConfidences, arrGlobalMatches, numGlobalCareGt)\n\n        methodRecall = 0 if numGlobalCareGt == 0 else float(matchedSum)/numGlobalCareGt\n        methodPrecision = 0 if numGlobalCareDet == 0 else float(matchedSum)/numGlobalCareDet\n        methodHmean = 0 if methodRecall + methodPrecision==0 else 2* methodRecall * methodPrecision / (methodRecall + methodPrecision)\n\n        methodMetrics = {'precision':methodPrecision, 'recall':methodRecall,'hmean': methodHmean, 'AP': AP}\n\n        resDict = {'calculated':True,'Message':'','method': methodMetrics,'per_sample': perSampleMetrics}\n\n        f.write('ALL :: AP=%.4f - Precision=%.4f - Recall=%.4f - Fscore=%.4f'\n                % (AP, methodPrecision, methodRecall, methodHmean))\n\n\n    return resDict;\n\n\n\nif __name__=='__main__':\n    rrc_evaluation_funcs.main_evaluation(None,default_evaluation_params,validate_data,evaluate_method)\n"
  },
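The `compute_ap` closure in `script.py` above ranks detections by descending confidence and accumulates precision at every true-positive rank, then divides by the number of care ground-truth boxes (not by the number of true positives). A standalone sketch of the same computation; the name `compute_average_precision` is my own:

```python
def compute_average_precision(confidences, matches, num_gt):
    """Average precision as computed by the evaluation script:
    detections are sorted by descending confidence, precision is
    accumulated at each rank where the detection is a true positive,
    and the sum is normalized by the number of care GT boxes."""
    if not confidences or num_gt <= 0:
        return 0.0
    # rank detection indices by confidence, highest first
    order = sorted(range(len(confidences)),
                   key=lambda i: -confidences[i])
    correct = 0
    ap = 0.0
    for rank, i in enumerate(order, start=1):
        if matches[i]:
            correct += 1
            ap += correct / rank  # precision at this recall point
    return ap / num_gt
```

For example, three detections with confidences [0.9, 0.8, 0.7] where the first and third match a GT box, against 2 care GT boxes, give (1/1 + 2/3) / 2 = 5/6.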
  {
    "path": "dataset/icdar15/Evaluation_Protocol/script075.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom collections import namedtuple\nimport rrc_evaluation_funcs\nimport importlib\nimport numpy as np\n\ndef evaluation_imports():\n    \"\"\"\n    evaluation_imports: Dictionary ( key = module name , value = alias  )  with python modules used in the evaluation. \n    \"\"\"    \n    return {\n            'Polygon':'plg',\n            'numpy':'np'\n            }\n\ndef default_evaluation_params():\n    \"\"\"\n    default_evaluation_params: Default parameters to use for the validation and evaluation.\n    \"\"\"\n    return {\n                'IOU_CONSTRAINT' :0.75,\n                'AREA_PRECISION_CONSTRAINT' :0.5,\n                'GT_SAMPLE_NAME_2_ID':'gt_img_([0-9]+).txt',\n                'DET_SAMPLE_NAME_2_ID':'res_img_([0-9]+).txt',\n                'LTRB':False, #LTRB:2points(left,top,right,bottom) or 4 points(x1,y1,x2,y2,x3,y3,x4,y4)\n                'CRLF':False, # Lines are delimited by Windows CRLF format\n                'CONFIDENCES':False, #Detections must include confidence value. 
AP will be calculated\n                'PER_SAMPLE_RESULTS':True #Generate per sample results and produce data for visualization\n            }\n\ndef validate_data(gtFilePath, submFilePath,evaluationParams):\n    \"\"\"\n    Method validate_data: validates that all files in the results folder are correct (have the correct name and contents).\n                            Validates also that there are no missing files in the folder.\n                            If an error is detected, the method raises it\n    \"\"\"\n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n    \n    #Validate format of GroundTruth\n    for k in gt:\n        rrc_evaluation_funcs.validate_lines_in_file(k,gt[k],evaluationParams['CRLF'],evaluationParams['LTRB'],True)\n\n    #Validate format of results\n    for k in subm:\n        if k not in gt:\n            raise Exception(\"The sample %s is not present in GT\" %k)\n        \n        rrc_evaluation_funcs.validate_lines_in_file(k,subm[k],evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n\n    \ndef evaluate_method(gtFilePath, submFilePath, evaluationParams):\n    \"\"\"\n    Method evaluate_method: evaluates the method and returns the results\n        Results. Dictionary with the following values:\n        - method (required)  Global method metrics. Ex: { 'Precision':0.8,'Recall':0.9 }\n        - samples (optional) Per sample metrics. 
Ex: {'sample1' : { 'Precision':0.8,'Recall':0.9 } , 'sample2' : { 'Precision':0.8,'Recall':0.9 } }\n    \"\"\"    \n    \n    for module,alias in evaluation_imports().items():\n        globals()[alias] = importlib.import_module(module)    \n    \n    def polygon_from_points(points):\n        \"\"\"\n        Returns a Polygon object to use with the Polygon2 class from a list of 8 points: x1,y1,x2,y2,x3,y3,x4,y4\n        \"\"\"        \n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(points[0])\n        resBoxes[0,4]=int(points[1])\n        resBoxes[0,1]=int(points[2])\n        resBoxes[0,5]=int(points[3])\n        resBoxes[0,2]=int(points[4])\n        resBoxes[0,6]=int(points[5])\n        resBoxes[0,3]=int(points[6])\n        resBoxes[0,7]=int(points[7])\n        pointMat = resBoxes[0].reshape([2,4]).T\n        return plg.Polygon( pointMat)    \n    \n    def rectangle_to_polygon(rect):\n        resBoxes=np.empty([1,8],dtype='int32')\n        resBoxes[0,0]=int(rect.xmin)\n        resBoxes[0,4]=int(rect.ymax)\n        resBoxes[0,1]=int(rect.xmin)\n        resBoxes[0,5]=int(rect.ymin)\n        resBoxes[0,2]=int(rect.xmax)\n        resBoxes[0,6]=int(rect.ymin)\n        resBoxes[0,3]=int(rect.xmax)\n        resBoxes[0,7]=int(rect.ymax)\n\n        pointMat = resBoxes[0].reshape([2,4]).T\n        \n        return plg.Polygon( pointMat)\n    \n    def rectangle_to_points(rect):\n        points = [int(rect.xmin), int(rect.ymax), int(rect.xmax), int(rect.ymax), int(rect.xmax), int(rect.ymin), int(rect.xmin), int(rect.ymin)]\n        return points\n        \n    def get_union(pD,pG):\n        areaA = pD.area();\n        areaB = pG.area();\n        return areaA + areaB - get_intersection(pD, pG);\n        \n    def get_intersection_over_union(pD,pG):\n        try:\n            return get_intersection(pD, pG) / get_union(pD, pG);\n        except:\n            return 0\n        \n    def get_intersection(pD,pG):\n        pInt = pD & pG\n        if 
len(pInt) == 0:\n            return 0\n        return pInt.area()\n    \n    def compute_ap(confList, matchList,numGtCare):\n        correct = 0\n        AP = 0\n        if len(confList)>0:\n            confList = np.array(confList)\n            matchList = np.array(matchList)\n            sorted_ind = np.argsort(-confList)\n            confList = confList[sorted_ind]\n            matchList = matchList[sorted_ind]\n            for n in range(len(confList)):\n                match = matchList[n]\n                if match:\n                    correct += 1\n                    AP += float(correct)/(n + 1)\n\n            if numGtCare>0:\n                AP /= numGtCare\n            \n        return AP\n    \n    perSampleMetrics = {}\n    \n    matchedSum = 0\n    \n    Rectangle = namedtuple('Rectangle', 'xmin ymin xmax ymax')\n    \n    gt = rrc_evaluation_funcs.load_zip_file(gtFilePath,evaluationParams['GT_SAMPLE_NAME_2_ID'])\n    subm = rrc_evaluation_funcs.load_zip_file(submFilePath,evaluationParams['DET_SAMPLE_NAME_2_ID'],True)\n   \n    numGlobalCareGt = 0;\n    numGlobalCareDet = 0;\n    \n    arrGlobalConfidences = [];\n    arrGlobalMatches = [];\n\n    for resFile in gt:\n        \n        gtFile = rrc_evaluation_funcs.decode_utf8(gt[resFile])\n        recall = 0\n        precision = 0\n        hmean = 0    \n        \n        detMatched = 0\n        \n        iouMat = np.empty([1,1])\n        \n        gtPols = []\n        detPols = []\n        \n        gtPolPoints = []\n        detPolPoints = []  \n        \n        #Array of Ground Truth Polygons' keys marked as don't Care\n        gtDontCarePolsNum = []\n        #Array of Detected Polygons' matched with a don't Care GT\n        detDontCarePolsNum = []   \n        \n        pairs = [] \n        detMatchedNums = []\n        \n        arrSampleConfidences = [];\n        arrSampleMatch = [];\n        sampleAP = 0;\n\n        evaluationLog = \"\"\n        \n        pointsList,_,transcriptionsList = 
rrc_evaluation_funcs.get_tl_line_values_from_file_contents(gtFile,evaluationParams['CRLF'],evaluationParams['LTRB'],True,False)\n        for n in range(len(pointsList)):\n            points = pointsList[n]\n            transcription = transcriptionsList[n]\n            dontCare = transcription == \"###\"\n            if evaluationParams['LTRB']:\n                gtRect = Rectangle(*points)\n                gtPol = rectangle_to_polygon(gtRect)\n            else:\n                gtPol = polygon_from_points(points)\n            gtPols.append(gtPol)\n            gtPolPoints.append(points)\n            if dontCare:\n                gtDontCarePolsNum.append( len(gtPols)-1 )\n                \n        evaluationLog += \"GT polygons: \" + str(len(gtPols)) + (\" (\" + str(len(gtDontCarePolsNum)) + \" don't care)\\n\" if len(gtDontCarePolsNum)>0 else \"\\n\")\n        \n        if resFile in subm:\n            \n            detFile = rrc_evaluation_funcs.decode_utf8(subm[resFile]) \n            \n            pointsList,confidencesList,_ = rrc_evaluation_funcs.get_tl_line_values_from_file_contents(detFile,evaluationParams['CRLF'],evaluationParams['LTRB'],False,evaluationParams['CONFIDENCES'])\n            for n in range(len(pointsList)):\n                points = pointsList[n]\n                \n                if evaluationParams['LTRB']:\n                    detRect = Rectangle(*points)\n                    detPol = rectangle_to_polygon(detRect)\n                else:\n                    detPol = polygon_from_points(points)                    \n                detPols.append(detPol)\n                detPolPoints.append(points)\n                if len(gtDontCarePolsNum)>0 :\n                    for dontCarePol in gtDontCarePolsNum:\n                        dontCarePol = gtPols[dontCarePol]\n                        intersected_area = get_intersection(dontCarePol,detPol)\n                        pdDimensions = detPol.area()\n                        precision = 0 if 
pdDimensions == 0 else intersected_area / pdDimensions\n                        if (precision > evaluationParams['AREA_PRECISION_CONSTRAINT'] ):\n                            detDontCarePolsNum.append( len(detPols)-1 )\n                            break\n                                \n            evaluationLog += \"DET polygons: \" + str(len(detPols)) + (\" (\" + str(len(detDontCarePolsNum)) + \" don't care)\\n\" if len(detDontCarePolsNum)>0 else \"\\n\")\n            \n            if len(gtPols)>0 and len(detPols)>0:\n                #Calculate IoU and precision matrixs\n                outputShape=[len(gtPols),len(detPols)]\n                iouMat = np.empty(outputShape)\n                gtRectMat = np.zeros(len(gtPols),np.int8)\n                detRectMat = np.zeros(len(detPols),np.int8)\n                for gtNum in range(len(gtPols)):\n                    for detNum in range(len(detPols)):\n                        pG = gtPols[gtNum]\n                        pD = detPols[detNum]\n                        iouMat[gtNum,detNum] = get_intersection_over_union(pD,pG)\n\n                for gtNum in range(len(gtPols)):\n                    for detNum in range(len(detPols)):\n                        if gtRectMat[gtNum] == 0 and detRectMat[detNum] == 0 and gtNum not in gtDontCarePolsNum and detNum not in detDontCarePolsNum :\n                            if iouMat[gtNum,detNum]>evaluationParams['IOU_CONSTRAINT']:\n                                gtRectMat[gtNum] = 1\n                                detRectMat[detNum] = 1\n                                detMatched += 1\n                                pairs.append({'gt':gtNum,'det':detNum})\n                                detMatchedNums.append(detNum)\n                                evaluationLog += \"Match GT #\" + str(gtNum) + \" with Det #\" + str(detNum) + \"\\n\"\n\n            if evaluationParams['CONFIDENCES']:\n                for detNum in range(len(detPols)):\n                    if detNum not in 
detDontCarePolsNum :\n                        #we exclude the don't care detections\n                        match = detNum in detMatchedNums\n\n                        arrSampleConfidences.append(confidencesList[detNum])\n                        arrSampleMatch.append(match)\n\n                        arrGlobalConfidences.append(confidencesList[detNum]);\n                        arrGlobalMatches.append(match);\n                            \n        numGtCare = (len(gtPols) - len(gtDontCarePolsNum))\n        numDetCare = (len(detPols) - len(detDontCarePolsNum))\n        if numGtCare == 0:\n            recall = float(1)\n            precision = float(0) if numDetCare >0 else float(1)\n            sampleAP = precision\n        else:\n            recall = float(detMatched) / numGtCare\n            precision = 0 if numDetCare==0 else float(detMatched) / numDetCare\n            if evaluationParams['CONFIDENCES'] and evaluationParams['PER_SAMPLE_RESULTS']:\n                sampleAP = compute_ap(arrSampleConfidences, arrSampleMatch, numGtCare )                    \n\n        hmean = 0 if (precision + recall)==0 else 2.0 * precision * recall / (precision + recall)                \n\n        matchedSum += detMatched\n        numGlobalCareGt += numGtCare\n        numGlobalCareDet += numDetCare\n        \n        if evaluationParams['PER_SAMPLE_RESULTS']:\n            perSampleMetrics[resFile] = {\n                                            'precision':precision,\n                                            'recall':recall,\n                                            'hmean':hmean,\n                                            'pairs':pairs,\n                                            'AP':sampleAP,\n                                            'iouMat':[] if len(detPols)>100 else iouMat.tolist(),\n                                            'gtPolPoints':gtPolPoints,\n                                            'detPolPoints':detPolPoints,\n                                   
         'gtDontCare':gtDontCarePolsNum,\n                                            'detDontCare':detDontCarePolsNum,\n                                            'evaluationParams': evaluationParams,\n                                            'evaluationLog': evaluationLog                                        \n                                        }\n                                    \n    # Compute MAP and MAR\n    AP = 0\n    if evaluationParams['CONFIDENCES']:\n        AP = compute_ap(arrGlobalConfidences, arrGlobalMatches, numGlobalCareGt)\n\n    methodRecall = 0 if numGlobalCareGt == 0 else float(matchedSum)/numGlobalCareGt\n    methodPrecision = 0 if numGlobalCareDet == 0 else float(matchedSum)/numGlobalCareDet\n    methodHmean = 0 if methodRecall + methodPrecision==0 else 2* methodRecall * methodPrecision / (methodRecall + methodPrecision)\n    \n    methodMetrics = {'precision':methodPrecision, 'recall':methodRecall,'hmean': methodHmean, 'AP': AP  }\n\n    resDict = {'calculated':True,'Message':'','method': methodMetrics,'per_sample': perSampleMetrics}\n    \n    \n    return resDict;\n\n\n\nif __name__=='__main__':\n        \n    rrc_evaluation_funcs.main_evaluation(None,default_evaluation_params,validate_data,evaluate_method)\n"
  },
  {
    "path": "dataset/icdar15/batch_eval.sh",
    "content": "for ((i=490000;i<=510000;i+=1000)); do echo $i;sh eval.sh $1/submit-${i}_1.5|grep Calc;done\n"
  },
  {
    "path": "dataset/icdar15/eval.sh",
    "content": "cd dataset/icdar15/\nrm submit/*\ncp $1/*.txt submit\ncd submit/;zip -r  submit.zip * &> ../log.txt ;mv submit.zip ../; cd ../\nrm log.txt\npython Evaluation_Protocol/script.py -g=gt.zip -s=submit.zip\n"
  },
  {
    "path": "dataset/icdar15/submit/res_img_1.txt",
    "content": "137,342,179,342,179,354,137,354\n"
  },
  {
    "path": "dataset/synth-text/README.md",
    "content": "## SynthText\n\nSynthText is uploaded to Baidu Cloud [link](https://pan.baidu.com/s/17Gk301SwsnoESM1jQZRq0g), extract code `tb5g`\n\n1. download from link above and unzip it SynthText.zip\n2. transform `.mat` ground truth to `.txt` format in `gt`: `$ python dataset/synth-text/make_list.py`\n3. make training list by running: `$ ls data/SynthText/gt/ > data/SynthText/image_list.txt` (see [issue #24](https://github.com/princewang1994/TextSnake.pytorch/issues/24))\n4. pretrain using synthtext:\n\n```bash\nCUDA_VISIBLE_DEVICES=$GPUID python train.py synthtext_pretrain --dataset synth-text --viz --max_epoch 1 --batch_size 8\n```"
  },
  {
    "path": "dataset/synth-text/make_list.py",
    "content": "import os\nimport scipy.io as io\nfrom tqdm import tqdm\n\ngt_mat_path = 'data/SynthText/gt.mat'\nim_root = 'data/SynthText/'\ntxt_root = 'data/SynthText/gt/'\n\nif not os.path.exists(txt_root):\n    os.mkdir(txt_root)\n\nprint('reading data from {}'.format(gt_mat_path))\ngt = io.loadmat(gt_mat_path)\nprint('Done.')\n\nfor i, imname in enumerate(tqdm(gt['imnames'][0])):\n    imname = imname[0]\n    img_id = os.path.basename(imname)\n    im_path = os.path.join(im_root, imname)\n    txt_path = os.path.join(txt_root, img_id.replace('jpg', 'txt'))\n\n    if len(gt['wordBB'][0,i].shape) == 2:\n        annots = gt['wordBB'][0,i].transpose(1, 0).reshape(-1, 8)\n    else:\n        annots = gt['wordBB'][0,i].transpose(2, 1, 0).reshape(-1, 8)\n    with open(txt_path, 'w') as f:\n        f.write(imname + '\\n')\n        for annot in annots:\n            str_write = ','.join(annot.astype(str).tolist())\n            f.write(str_write + '\\n')\n\nprint('End.')"
  },
  {
    "path": "dataset/synth_text.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport os\nimport numpy as np\nfrom dataset.data_util import pil_load_img\nfrom dataset.dataload import TextDataset, TextInstance\n\n\nclass SynthText(TextDataset):\n\n    def __init__(self, data_root, is_training=True, load_memory=False, transform=None, ignore_list=None):\n        super().__init__(transform, is_training)\n        self.data_root = data_root\n        self.is_training = is_training\n        self.image_root = data_root\n        self.load_memory = load_memory\n\n        self.annotation_root = os.path.join(data_root, 'gt')\n        with open(os.path.join(data_root, 'image_list.txt')) as f:\n            self.annotation_list = [line.strip() for line in f.readlines()]\n\n        #if self.load_memory:\n        #    self.datas = list()\n        #    for item in range(len(self.image_list)):\n        #        self.datas.append(self.load_img_gt(item))\n\n    @staticmethod\n    def parse_txt(annotation_path):\n\n        with open(annotation_path) as f:\n            lines = [line.strip() for line in f.readlines()]\n            image_id = lines[0]\n            polygons = []\n            for line in lines[1:]:\n                points = [float(coordinate) for coordinate in line.split(',')]\n                points = np.array(points, dtype=int).reshape(4, 2)\n                polygon = TextInstance(points, 'c', 'abc')\n                polygons.append(polygon)\n        return image_id, polygons\n\n    def load_img_gt(self, item):\n        # Read annotation\n        annotation_id = self.annotation_list[item]\n        annotation_path = os.path.join(self.annotation_root, annotation_id)\n        image_id, polygons = self.parse_txt(annotation_path)\n\n        # Read image data\n        image_path = os.path.join(self.image_root, image_id)\n        image = pil_load_img(image_path)\n\n        data = dict()\n        data[\"image\"] = image\n        data[\"polygons\"] = polygons\n        data[\"image_id\"] = 
image_id\n        data[\"image_path\"] = image_path\n\n        return data\n\n    def __getitem__(self, item):\n\n        if self.load_memory:\n            data = self.datas[item]\n        else:\n            data = self.load_img_gt(item)\n\n        if self.is_training:\n            return self.get_training_data(data[\"image\"], data[\"polygons\"],\n                                          image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n        else:\n            return self.get_test_data(data[\"image\"], data[\"polygons\"],\n                                      image_id=data[\"image_id\"], image_path=data[\"image_path\"])\n\n    def __len__(self):\n        return len(self.annotation_list)\n\n\nif __name__ == '__main__':\n    from util.augmentation import BaseTransform, Augmentation\n    from util import canvas as cav  # provides the heatmap() helper used below\n    import time\n    import cv2\n    means = (0.485, 0.456, 0.406)\n    stds = (0.229, 0.224, 0.225)\n\n    transform = Augmentation(\n        size=512, mean=means, std=stds\n    )\n\n    trainset = SynthText(\n        data_root='data/SynthText',\n        is_training=True,\n        transform=transform\n    )\n\n    # img, train_mask, tr_mask, tcl_mask, radius_map, sin_map, cos_map, meta = trainset[944]\n    for idx in range(0, len(trainset)):\n        t0 = time.time()\n        img, train_mask, tr_mask = trainset[idx]\n        img, train_mask, tr_mask = map(lambda x: x.cpu().numpy(), (img, train_mask, tr_mask))\n\n        img = img.transpose(1, 2, 0)\n        img = ((img * stds + means) * 255).astype(np.uint8)\n        print(idx, img.shape)\n\n        for i in range(tr_mask.shape[2]):\n            cv2.imshow(\"tr_mask_{}\".format(i),\n                       cav.heatmap(np.array(tr_mask[:, :, i] * 255 / np.max(tr_mask[:, :, i]), dtype=np.uint8)))\n\n        cv2.imshow('imgs', img)\n        cv2.waitKey(0)\n"
  },
  {
    "path": "dataset/total_text/.gitignore",
    "content": "groundtruth_text.zip\ntotaltext.zip\n"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/ComputePrecisionRecall.m",
    "content": "function [ precision, recall ] = ComputePrecisionRecall( tau, sigma, tp,tr,k,fsc_k )\n%COMPUTEPRECISIONRECALL Summary of this function goes here\n%   Detailed explanation goes here\n\nif nargin == 2\n    tr = 0.8;   % recall threshold\n    tp = 0.4;   % precision threshold\n    k = 2;      % min number of matches, used in penalizing split & merge\n    fsc_k = 0.8;% penalize value of split or merge\nend\n\ntot_gt = 0;\nrecall_accum = 0;\n\ntot_detected = 0;\nprecision_accum = 0;\n\nnum_images = numel(tau);\nassert(num_images == numel(sigma));\n\nflag_gt = cell(1, num_images);\nflag_det = cell(1, num_images);\n\nfor ifile=1:num_images\n    \n    [num_gt, num_detected] = size( sigma{ifile} );\n    tot_gt = tot_gt + num_gt;\n    tot_detected = tot_detected + num_detected;\n    \n    % ----- mark unprocessed\n    flag_gt{ifile} = zeros(num_gt, 1);\n    flag_det{ifile} = zeros(num_detected, 1);\n    \n    % ---------------------------------------\n    % check one-to-one match\n    % ---------------------------------------\n    for i_gt=1:num_gt\n        \n        num_detected_in_sigma = numel( find( sigma{ifile}(i_gt,:)>tr) );\n        num_detected_in_tau = numel( find( tau{ifile}(i_gt,:)>tp) );\n        \n        if num_detected_in_sigma == 1 && num_detected_in_tau == 1\n            recall_accum = recall_accum + 1.0;\n            precision_accum = precision_accum + 1.0;\n            \n            % Mark the ground truth and detection, do not process twice\n            flag_gt{ifile}(i_gt) = 1;\n            idx_det = sigma{ifile}(i_gt,:)>tr;\n            flag_det{ifile}(idx_det) = 1;\n        end\n    end\n    \n    % ---------------------------------------\n    % check one-to-many match (split)\n    % one gt with many detected rectangles\n    % ---------------------------------------\n    for i_gt=1:num_gt\n        \n        if flag_gt{ifile}(i_gt) > 0\n            continue;\n        end\n        \n        num_nonzero_in_sigma = sum( 
sigma{ifile}(i_gt,:)>0 );\n        if num_nonzero_in_sigma >= k\n            \n            % -------------------------------------------------------------\n            % Search the possible \"many\" partners for this\t\"one\" rectangle\n            % -------------------------------------------------------------\n            \n            % ----- satisfy 1st condition\n            % only select unprocessed data\n            idx_detected_in_tau = find( (tau{ifile}(i_gt,:)'>=tp) & (flag_det{ifile}==0) );\n            num_detected_in_tau = numel( idx_detected_in_tau );\n            \n            if num_detected_in_tau == 1\n                % Only one of the many-rectangles qualified ->\n                % This match degraded to a one-to-one match\n                if ( (tau{ifile}(i_gt, idx_detected_in_tau) >= tp) && ...\n                        (sigma{ifile}(i_gt, idx_detected_in_tau) >= tr) )\n                    recall_accum = recall_accum + 1.0;\n                    precision_accum = precision_accum + 1.0;\n                end\n            else\n                % satisfy 2nd condition\n                if sum( sigma{ifile}(i_gt,idx_detected_in_tau) ) >= tr\n                \n                    % Mark the \"one\" rectangle\n                    flag_gt{ifile}(i_gt) = 1;\n                    \n                    % Mark all the \"many\" rectangles\n                    flag_det{ifile}(idx_detected_in_tau) = 1;\n                    \n                    recall_accum = recall_accum + fsc_k;\n                    precision_accum = precision_accum + num_detected_in_tau * fsc_k;\n\n                end\n            end\n            \n        end\n        \n        % No match\n        recall_accum = recall_accum + 0;\n        precision_accum = precision_accum + 0;\n        \n    end\n    \n    % ---------------------------------------\n    % check many-to-one match (merge)\n    % one detected rectangle with many gt\n    % ---------------------------------------\n    for 
i_test=1:num_detected\n        \n        if flag_det{ifile}(i_test) > 0\n            continue;\n        end\n        \n        num_nonzero_in_tau = sum( tau{ifile}(:,i_test)>0 );\n        if num_nonzero_in_tau >= k\n            \n            % satisfy 1st condition\n            % only select unprocessed data\n            idx_detected_in_sigma = find( (sigma{ifile}(:,i_test)>=tr) & (flag_gt{ifile}==0) );\n            num_detected_in_sigma = numel( idx_detected_in_sigma );\n            \n            if num_detected_in_sigma == 1\n                % Only one of the many-rectangles qualified ->\n                % This match degraded to a one-to-one match\n                if ( (tau{ifile}(idx_detected_in_sigma, i_test) >= tp) && ...\n                        (sigma{ifile}(idx_detected_in_sigma, i_test) >= tr) )\n                    recall_accum = recall_accum + 1.0;\n                    precision_accum = precision_accum + 1.0;\n                end\n            else\n                % satisfy 2nd condition\n                if sum( tau{ifile}(idx_detected_in_sigma,i_test) ) >= tp\n                    % Mark the \"one\" rectangle\n                    flag_det{ifile}(i_test) = 1;\n                    \n                    % Mark all the \"many\" rectangles\n                    flag_gt{ifile}(idx_detected_in_sigma) = 1;\n                    \n                    recall_accum = recall_accum + num_detected_in_sigma*fsc_k;\n                    precision_accum = precision_accum + fsc_k;\n%                     recall_accum = recall_accum + num_detected_in_sigma;\n%                     precision_accum = precision_accum + 1.0;\n                end\n            end\n            \n        end\n        \n        % No match\n        recall_accum = recall_accum + 0;\n        precision_accum = precision_accum + 0;\n    end\n    \nend\nrecall = recall_accum / tot_gt;\nprecision = precision_accum / tot_detected;\n\nend\n\n"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/Eval.m",
    "content": "%% Evaluation method for Total-Text.  \n% Chee Kheng Ch'ng and Chee Seng Chan.\n% \"Total-Text:  A Comprehensive Dataset for Scene Text Detection and\n% Recognition.\n% It's built on top of Wolf & Jolion's method. \n% Wolf, Christian, and Jean-Michel Jolion. \n% \"Object count/area graphs for the evaluation of object detection and segmentation algorithms.\" \n% International Journal of Document Analysis and Recognition (IJDAR) 8.4 (2006): 280-296.\n%\n\n%% Initialization\n\nclearvars;\nclose all;\n\n%% Path configuration %%\n% gtPath: Path to groundtruth directory\n% infPath: Path to prediction directory \n% fidPath: A text file directory to capture all individual results\ngtPath = '/home/ck/Dropbox/Evaluation_Protocol/Examples/Groundtruth';\npredPath = '/home/ck/Dropbox/Evaluation_Protocol/Examples/Prediction';\nfidPath = '/home/ck/Dropbox/Evaluation_Protocol/Examples/Result.txt';\n\n% This script will look to load your result files(infPath) based on what you have in\n% gtPath.\nallFiles = dir(gtPath);\nallNames = { allFiles.name };\n\n% constants\ntr = 0.8;   % recall threshold\ntp = 0.4;   % precision threshold\nk_t = 2;      % min number of matches, used in penalizing split & merge\nfsc_k = 0.8;    % penalize value of split or merge\n\n%% Prepare overlap matrices\nnumFiles_test = numel(allNames) - 2;\nsigma = cell(numFiles_test,1);  % overlap matrix recall\ntau = cell(numFiles_test,1);    % overlap matrix precision\n\nfor i=3:(numFiles_test + 2)\n    % Outer for loop to run through every groundtruth mat files.\n    disp(allNames{i})\n    \n    gt = load([gtPath '/' allNames{i}]);\n    pred_name = strsplit(allNames{i}, '_');\n    pred = load([predPath '/' pred_name{3}]);\n    %We stored our groundtruth and prediction result in structure, feel\n    %free too change according to your need.\n    gt = gt.polygt;\n    pred = pred.accuInf;\n    \n    % Get the number of polygon boundaries in result file\n    numPolyinTestData = size(pred,1);\n\n    % 
Get the number of polygon boundaries in ground truth file\n    numPolyinGTData = size(gt,1);\n\n    % initialized overlap matrices to zeros\n    sigma{i-2} = zeros(numPolyinGTData, numPolyinTestData);\n    tau{i-2} = zeros(numPolyinGTData, numPolyinTestData);\n    clear gt_poly;\n    for j = 1:size(gt,1)\n        % For loop to run through groundtruth\n        gt_Ph_x = gt{j,2}(:);\n        gt_Ph_y = gt{j,4}(:);\n        \n        gt_poly(j).x_ = double(gt_Ph_x);\n        gt_poly(j).y_ = double(gt_Ph_y);\n        poly_gt_x = gt_poly(j).x_; poly_gt_y = gt_poly(j).y_;\n        % The order of polygon points need to be clockwise\n        if ~ispolycw(poly_gt_x, poly_gt_y)\n            [poly_gt_x, poly_gt_y] = poly2cw(poly_gt_x, poly_gt_y);\n        end\n\n        gt_area = polyarea(poly_gt_x, poly_gt_y);\n        clear pred_poly;\n        % For loop to run through every prediction \n        for k = 1:size(pred,1)\n            pred_Ph = pred{k};\n            pred_poly(k).x_ = pred_Ph(:,1);\n            pred_poly(k).y_ = pred_Ph(:,2);\n            poly_pred_x = pred_poly(k).x_; poly_pred_y = pred_poly(k).y_;\n            % The order of polygon points need to be clockwise\n            if ~ispolycw(poly_pred_x, poly_pred_y)\n                [poly_pred_x, poly_pred_y] = poly2cw(poly_pred_x, poly_pred_y);\n            end\n\n            pred_area = polyarea(poly_pred_x, poly_pred_y);\n\n            % Get polygon intersection from two polygons\n            [sx, sy] = polybool('intersection', poly_gt_x, poly_gt_y, poly_pred_x, poly_pred_y);\n\n            if ~isempty(sx) || ~isempty(sx)\n                % update sigma and tau if it is intercepted\n                if isShapeMultipart(sx, sy)\n                    % if the intersection has multi-part\n                    [sx1,sy1] = polysplit(sx,sy);\n                    intersec_area = 0;\n                    for m=1:numel(sx1)\n                        intersec_area = intersec_area + polyarea(sx1{m}, sy1{m});\n                    
end\n                else\n                    intersec_area = polyarea(sx, sy);\n                end\n\n                % compute intersection\n                recall = intersec_area/gt_area;\n                precision = intersec_area/pred_area;\n                fid = fopen(fidPath, 'a');\n                temp = ([allNames{i} ' ' mat2str(precision) ' ' mat2str(recall)  '\\n']); \n                fprintf(fid,temp);\n                fclose(fid);\n                % fill in the overlap matrix\n                sigma{i-2}(j, k) = recall;\n                tau{i-2}(j, k) = precision;\n            end\n        end\n    end\nend\n\n[ precision, recall ] = ComputePrecisionRecall( tau, sigma, tp,tr,k_t,fsc_k );\n\n%% Display final result\ndisp(sprintf('\\nPrecision = %f', precision));\ndisp(sprintf('Recall    = %f', recall));\nf_score = 2*precision*recall/(precision+recall);\ndisp(sprintf('F-Score   = %f\\n', f_score));\n\ndisp('Finish processing...');"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/Examples/Result.txt",
    "content": "poly_gt_img1.mat 0.779921424378009 0.0942898883654127\npoly_gt_img1.mat 0.525106870874419 0.629432040680526\npoly_gt_img2.mat 0.559817673141842 0.999892524773751\npoly_gt_img2.mat 0.590193543826295 0.99875589640427\npoly_gt_img2.mat 0.0942022257298617 1\npoly_gt_img2.mat 0.103871814938123 1\npoly_gt_img2.mat 0.010505232720086 1\npoly_gt_img2.mat 0.0198166889947076 1\npoly_gt_img2.mat 0.104654401719038 1\npoly_gt_img2.mat 0.0875966627316258 1\npoly_gt_img2.mat 0.0931278269289438 1\npoly_gt_img3.mat 0.48681831806657 0.999451986053668\npoly_gt_img3.mat 0.40616586652521 0.821243459290938\npoly_gt_img3.mat 0.355970732107761 0.340450325543543\npoly_gt_img3.mat 0.0300002607017102 0.010002712610903\npoly_gt_img3.mat 0.840746644511745 0.581523258188604\npoly_gt_img4.mat 0.164790083242852 0.99986467602099\npoly_gt_img4.mat 0.0288083252252633 1\npoly_gt_img4.mat 0.0897669106138162 1\npoly_gt_img4.mat 0.561632398620351 0.310857651237011\npoly_gt_img4.mat 0.0418624972017546 0.0414709318543488\npoly_gt_img4.mat 0.0727406318883174 1\npoly_gt_img4.mat 0.0369466554154218 1\npoly_gt_img4.mat 0.0392957328893653 1\npoly_gt_img4.mat 0.0170137067530276 1\npoly_gt_img4.mat 0.0282573494195726 1\npoly_gt_img4.mat 0.019066298720551 1\npoly_gt_img4.mat 0.0166985325234465 0.997521192854982\npoly_gt_img4.mat 0.0249503956941182 1\npoly_gt_img5.mat 1 0.0903089701228755\n"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/Python_scripts/Deteval.py",
    "content": "# modified from https://github.com/cs-chan/Total-Text-Dataset/blob/master/Evaluation_Protocol/Python_scripts/Deteval.py\nimport warnings\nwarnings.filterwarnings(\"ignore\")\nimport numpy as np\nimport os\nfrom os import listdir\nfrom scipy import io\n# from polygon_wrapper import iod\n# from polygon_wrapper import area_of_intersection\n# from polygon_wrapper import area\nfrom polygon_fast import iod\nfrom polygon_fast import area_of_intersection\nfrom polygon_fast import area\n\nimport argparse\nfrom tqdm import tqdm\n\nparser = argparse.ArgumentParser()\n\n# basic opts\nparser.add_argument('exp_name', type=str, help='Model output directory')\nparser.add_argument('--tr', type=float, default=0.7, help='Recall threshold')\nparser.add_argument('--tp', type=float, default=0.6, help='Precision threshold')\nargs = parser.parse_args()\n\n\"\"\"\nInput format: y0,x0, ..... yn,xn. Each detection is separated by the end of line token ('\\n')'\n\"\"\"\n\ninput_dir = 'output/{}'.format(args.exp_name)\neval_result_dir = \"output/Analysis/output_eval\"\ngt_dir = 'data/total-text-mat/gt/Test'\nfid_path = '{}/Eval_TotalText_{}_{}.txt'.format(eval_result_dir, args.tr, args.tp)\n\n\nallInputs = listdir(input_dir)\n\n\ndef input_reading_mod(input_dir, input):\n    \"\"\"This helper reads input from txt files\"\"\"\n    with open('%s/%s' % (input_dir, input), 'r') as input_fid:\n        pred = input_fid.readlines()\n    det = [x.strip('\\n') for x in pred]\n    return det\n\n\ndef gt_reading_mod(gt_dir, gt_id):\n    \"\"\"This helper reads groundtruths from mat files\"\"\"\n    gt_id = gt_id.split('.')[0]\n    gt = io.loadmat('%s/poly_gt_%s.mat' % (gt_dir, gt_id))\n    gt = gt['polygt']\n    return gt\n\n\ndef detection_filtering(detections, groundtruths, threshold=0.5):\n    for gt_id, gt in enumerate(groundtruths):\n        if (gt[5] == '#') and (gt[1].shape[1] > 1):\n            gt_x = list(map(int, np.squeeze(gt[1])))\n            gt_y = list(map(int, 
np.squeeze(gt[3])))\n            for det_id, detection in enumerate(detections):\n                detection = detection.split(',')\n                detection = list(map(int, detection[0:]))\n                det_y = detection[1::2]\n                det_x = detection[0::2]\n                det_gt_iou = iod(det_x, det_y, gt_x, gt_y)\n                if det_gt_iou > threshold:\n                    detections[det_id] = []\n\n            detections[:] = [item for item in detections if item != []]\n    return detections\n\ndef sigma_calculation(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    sigma = inter_area / gt_area\n    \"\"\"\n    return np.round((area_of_intersection(det_x, det_y, gt_x, gt_y) / area(gt_x, gt_y)), 2)\n\ndef tau_calculation(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    tau = inter_area / det_area\n    \"\"\"\n    return np.round((area_of_intersection(det_x, det_y, gt_x, gt_y) / area(det_x, det_y)), 2)\n\n\n\n##############################Initialization###################################\nglobal_tp = 0\nglobal_fp = 0\nglobal_fn = 0\nglobal_sigma = []\nglobal_tau = []\ntr = args.tr\ntp = args.tp\nfsc_k = 0.8\nk = 2\n###############################################################################\n\nfor i, input_id in enumerate(tqdm(allInputs)):\n    if (input_id != '.DS_Store') and (input_id != 'Pascal_result.txt') and (\n            input_id != 'Pascal_result_curved.txt') and (input_id != 'Pascal_result_non_curved.txt') and (\n            input_id != 'Deteval_result.txt') and (input_id != 'Deteval_result_curved.txt') \\\n            and (input_id != 'Deteval_result_non_curved.txt'):\n        # print(input_id)\n        detections = input_reading_mod(input_dir, input_id)\n        groundtruths = gt_reading_mod(gt_dir, input_id)\n        detections = detection_filtering(detections, groundtruths)  # filters detections overlapping with DC area\n        dc_id = np.where(groundtruths[:, 5] == '#')\n        groundtruths = np.delete(groundtruths, (dc_id), (0))\n\n     
   local_sigma_table = np.zeros((groundtruths.shape[0], len(detections)))\n        local_tau_table = np.zeros((groundtruths.shape[0], len(detections)))\n        for gt_id, gt in enumerate(groundtruths):\n            if len(detections) > 0:\n                for det_id, detection in enumerate(detections):\n                    detection = detection.split(',')\n                    detection = list(map(int, detection))\n                    det_y = detection[1::2]\n                    det_x = detection[0::2]\n                    gt_x = list(map(int, np.squeeze(gt[1])))\n                    gt_y = list(map(int, np.squeeze(gt[3])))\n\n                    local_sigma_table[gt_id, det_id] = sigma_calculation(det_x, det_y, gt_x, gt_y)\n                    local_tau_table[gt_id, det_id] = tau_calculation(det_x, det_y, gt_x, gt_y)\n        global_sigma.append(local_sigma_table)\n        global_tau.append(local_tau_table)\n\nglobal_accumulative_recall = 0\nglobal_accumulative_precision = 0\ntotal_num_gt = 0\ntotal_num_det = 0\n\n\ndef one_to_one(local_sigma_table, local_tau_table, local_accumulative_recall,\n               local_accumulative_precision, global_accumulative_recall, global_accumulative_precision,\n               gt_flag, det_flag):\n    for gt_id in range(num_gt):\n        qualified_sigma_candidates = np.where(local_sigma_table[gt_id, :] > tr)\n        num_qualified_sigma_candidates = qualified_sigma_candidates[0].shape[0]\n        qualified_tau_candidates = np.where(local_tau_table[gt_id, :] > tp)\n        num_qualified_tau_candidates = qualified_tau_candidates[0].shape[0]\n\n        if (num_qualified_sigma_candidates == 1) and (num_qualified_tau_candidates == 1):\n            global_accumulative_recall = global_accumulative_recall + 1.0\n            global_accumulative_precision = global_accumulative_precision + 1.0\n            local_accumulative_recall = local_accumulative_recall + 1.0\n            local_accumulative_precision = local_accumulative_precision + 
1.0\n\n            gt_flag[0, gt_id] = 1\n            matched_det_id = np.where(local_sigma_table[gt_id, :] > tr)\n            det_flag[0, matched_det_id] = 1\n    return local_accumulative_recall, local_accumulative_precision, global_accumulative_recall, global_accumulative_precision, gt_flag, det_flag\n\n\ndef one_to_many(local_sigma_table, local_tau_table, local_accumulative_recall,\n               local_accumulative_precision, global_accumulative_recall, global_accumulative_precision,\n               gt_flag, det_flag):\n    for gt_id in range(num_gt):\n        #skip the following if the groundtruth was matched\n        if gt_flag[0, gt_id] > 0:\n            continue\n\n        non_zero_in_sigma = np.where(local_sigma_table[gt_id, :] > 0)\n        num_non_zero_in_sigma = non_zero_in_sigma[0].shape[0]\n\n        if num_non_zero_in_sigma >= k:\n            ####search for all detections that overlaps with this groundtruth\n            qualified_tau_candidates = np.where((local_tau_table[gt_id, :] >= tp) & (det_flag[0, :] == 0))\n            num_qualified_tau_candidates = qualified_tau_candidates[0].shape[0]\n\n            if num_qualified_tau_candidates == 1:\n                if ((local_tau_table[gt_id, qualified_tau_candidates] >= tp) and (local_sigma_table[gt_id, qualified_tau_candidates] >= tr)):\n                    #became an one-to-one case\n                    global_accumulative_recall = global_accumulative_recall + 1.0\n                    global_accumulative_precision = global_accumulative_precision + 1.0\n                    local_accumulative_recall = local_accumulative_recall + 1.0\n                    local_accumulative_precision = local_accumulative_precision + 1.0\n\n                    gt_flag[0, gt_id] = 1\n                    det_flag[0, qualified_tau_candidates] = 1\n            elif (np.sum(local_sigma_table[gt_id, qualified_tau_candidates]) >= tr):\n                gt_flag[0, gt_id] = 1\n                det_flag[0, qualified_tau_candidates] = 
1\n\n                global_accumulative_recall = global_accumulative_recall + fsc_k\n                global_accumulative_precision = global_accumulative_precision + num_qualified_tau_candidates * fsc_k\n\n                local_accumulative_recall = local_accumulative_recall + fsc_k\n                local_accumulative_precision = local_accumulative_precision + num_qualified_tau_candidates * fsc_k\n\n    return local_accumulative_recall, local_accumulative_precision, global_accumulative_recall, global_accumulative_precision, gt_flag, det_flag\n\n\ndef many_to_many(local_sigma_table, local_tau_table, local_accumulative_recall,\n               local_accumulative_precision, global_accumulative_recall, global_accumulative_precision,\n               gt_flag, det_flag):\n    for det_id in range(num_det):\n        # skip the following if the detection was already matched\n        if det_flag[0, det_id] > 0:\n            continue\n\n        non_zero_in_tau = np.where(local_tau_table[:, det_id] > 0)\n        num_non_zero_in_tau = non_zero_in_tau[0].shape[0]\n\n        if num_non_zero_in_tau >= k:\n            #### search for all unmatched ground truths that overlap with this detection\n            qualified_sigma_candidates = np.where((local_sigma_table[:, det_id] >= tp) & (gt_flag[0, :] == 0))\n            num_qualified_sigma_candidates = qualified_sigma_candidates[0].shape[0]\n\n            if num_qualified_sigma_candidates == 1:\n                if ((local_tau_table[qualified_sigma_candidates, det_id] >= tp) and (local_sigma_table[qualified_sigma_candidates, det_id] >= tr)):\n                    # became a one-to-one case\n                    global_accumulative_recall = global_accumulative_recall + 1.0\n                    global_accumulative_precision = global_accumulative_precision + 1.0\n                    local_accumulative_recall = local_accumulative_recall + 1.0\n                    local_accumulative_precision = local_accumulative_precision + 1.0\n\n                    gt_flag[0, 
qualified_sigma_candidates] = 1\n                    det_flag[0, det_id] = 1\n            elif (np.sum(local_tau_table[qualified_sigma_candidates, det_id]) >= tp):\n                det_flag[0, det_id] = 1\n                gt_flag[0, qualified_sigma_candidates] = 1\n\n                global_accumulative_recall = global_accumulative_recall + num_qualified_sigma_candidates * fsc_k\n                global_accumulative_precision = global_accumulative_precision + fsc_k\n\n                local_accumulative_recall = local_accumulative_recall + num_qualified_sigma_candidates * fsc_k\n                local_accumulative_precision = local_accumulative_precision + fsc_k\n    return local_accumulative_recall, local_accumulative_precision, global_accumulative_recall, global_accumulative_precision, gt_flag, det_flag\n\nfid = open(fid_path, 'w')\n\nfor idx in range(len(global_sigma)):\n\n    local_sigma_table = global_sigma[idx]\n    local_tau_table = global_tau[idx]\n\n    num_gt = local_sigma_table.shape[0]\n    num_det = local_sigma_table.shape[1]\n\n    total_num_gt = total_num_gt + num_gt\n    total_num_det = total_num_det + num_det\n\n    local_accumulative_recall = 0\n    local_accumulative_precision = 0\n    gt_flag = np.zeros((1, num_gt))\n    det_flag = np.zeros((1, num_det))\n\n    #######first check for one-to-one case##########\n    local_accumulative_recall, local_accumulative_precision, global_accumulative_recall, global_accumulative_precision, \\\n    gt_flag, det_flag = one_to_one(local_sigma_table, local_tau_table,\n                                  local_accumulative_recall, local_accumulative_precision,\n                                  global_accumulative_recall, global_accumulative_precision,\n                                  gt_flag, det_flag)\n\n    #######then check for one-to-many case##########\n    local_accumulative_recall, local_accumulative_precision, global_accumulative_recall, global_accumulative_precision, \\\n    gt_flag, det_flag = 
one_to_many(local_sigma_table, local_tau_table,\n                                   local_accumulative_recall, local_accumulative_precision,\n                                   global_accumulative_recall, global_accumulative_precision,\n                                   gt_flag, det_flag)\n\n    #######then check for many-to-many case##########\n    local_accumulative_recall, local_accumulative_precision, global_accumulative_recall, global_accumulative_precision, \\\n    gt_flag, det_flag = many_to_many(local_sigma_table, local_tau_table,\n                                    local_accumulative_recall, local_accumulative_precision,\n                                    global_accumulative_recall, global_accumulative_precision,\n                                    gt_flag, det_flag)\n\n    try:\n        local_precision = local_accumulative_precision / num_det\n    except ZeroDivisionError:\n        local_precision = 0\n\n    try:\n        local_recall = local_accumulative_recall / num_gt\n    except ZeroDivisionError:\n        local_recall = 0\n\n    str_write = ('%s :: Precision=%.4f - Recall=%.4f\\n' % (allInputs[idx], local_precision, local_recall))\n    fid.write(str_write)\nfid.close()\n\ntry:\n    recall = global_accumulative_recall / total_num_gt\nexcept ZeroDivisionError:\n    recall = 0\n\ntry:\n    precision = global_accumulative_precision / total_num_det\nexcept ZeroDivisionError:\n    precision = 0\n\ntry:\n    f_score = 2*precision*recall/(precision+recall)\nexcept ZeroDivisionError:\n    f_score = 0\n\nfid = open(fid_path, 'a')\nstr_write = ('ALL :: Precision=%.4f - Recall=%.4f - Fscore=%.4f' % (precision, recall, f_score))\nfid.write(str_write)\nfid.close()\n\nprint('Input: {}'.format(input_dir))\nprint('Config: tr: {} - tp: {}'.format(tr, tp))\nprint(str_write)\nprint('Done.\\n')"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/Python_scripts/Pascal_VOC.py",
    "content": "from os import listdir\nfrom scipy import io\nimport numpy as np\nfrom skimage.draw import polygon\nfrom polygon_wrapper import iou\nfrom polygon_wrapper import iod\n\n\n\"\"\"\nInput format: y0,x0, ..... yn,xn. Each detection is separated by the end of line token ('\\n')'\n\"\"\"\n\ninput_dir = '' #detection directory goes here\ngt_dir = '' #gt directory goes here\nfid_path = '' #output text file directory goes here\n\nallInputs = listdir(input_dir)\n\ndef input_reading_mod(input_dir, input):\n    \"\"\"This helper convert input\"\"\"\n    with open('%s/%s' %(input_dir, input), 'r') as input_fid:\n        pred = input_fid.readlines()\n    det = [x.strip('\\n') for x in pred]\n    return det\n\n\ndef gt_reading_mod(gt_dir, gt_id):\n    gt_id = gt_id.split('.')[0]\n    gt = io.loadmat('%s/poly_gt_%s.mat' %(gt_dir, gt_id))\n    gt = gt['polygt']\n    return gt\n\ndef detection_filtering(detections, groundtruths, threshold=0.5):\n    for gt_id, gt in enumerate(groundtruths):\n        if (gt[5] == '#') and (gt[1].shape[1] > 1):\n            gt_x = map(int, np.squeeze(gt[1]))\n            gt_y = map(int, np.squeeze(gt[3]))\n            for det_id, detection in enumerate(detections):\n                detection = detection.split(',')\n                detection = map(int, detection)\n                det_y = detection[0::2]\n                det_x = detection[1::2]\n                det_gt_iou = iod(det_x, det_y, gt_x, gt_y)\n                if det_gt_iou > threshold:\n                    detections[det_id] = []\n\n            detections[:] = [item for item in detections if item != []]\n    return detections\n\nglobal_tp = 0\nglobal_fp = 0\nglobal_fn = 0\nfor input_id in allInputs:\n    if (input_id != '.DS_Store'):\n        print(input_id)\n        detections = input_reading_mod(input_dir, input_id)\n        groundtruths = gt_reading_mod(gt_dir, input_id)\n        detections = detection_filtering(detections, groundtruths) #filtering detections overlaps with 
DC area\n        dc_id = np.where(groundtruths[:, 5] == '#')\n        groundtruths = np.delete(groundtruths, (dc_id), (0))\n\n        iou_table = np.zeros((groundtruths.shape[0], len(detections)))\n        det_flag = np.zeros((len(detections), 1))\n        gt_flag = np.zeros((groundtruths.shape[0], 1))\n        tp = 0\n        fp = 0\n        fn = 0\n        for gt_id, gt in enumerate(groundtruths):\n            if len(detections) > 0:\n                for det_id, detection in enumerate(detections):\n                    detection = detection.split(',')\n                    detection = map(int, detection) #take it off in the final script\n                    det_y = detection[0::2]\n                    det_x = detection[1::2]\n                    gt_x = map(int, np.squeeze(gt[1]))\n                    gt_y = map(int, np.squeeze(gt[3]))\n\n                    iou_table[gt_id, det_id] = iou(det_x, det_y, gt_x, gt_y)\n\n                best_matched_det_id = np.argmax(iou_table[gt_id, :]) #identified the best matched detection candidates with current groundtruth\n\n                matched_id = np.where(iou_table[gt_id, :] >= 0.5)\n                if (iou_table[gt_id, best_matched_det_id] >= 0.5):\n                    if matched_id[0].shape[0] < 2:\n                        tp = tp + 1.0\n                        global_tp = global_tp + 1.0\n                        det_flag[best_matched_det_id] = 1\n                        gt_flag[gt_id] = 1\n                    else:\n                        tp = tp + 1.0\n                        global_tp = global_tp + 1.0\n                        det_flag[best_matched_det_id] = 1\n                        gt_flag[gt_id] = 1\n                        #if there are more than 1 matched detection, only 1 is contributed to tp, the rest are fp\n                        fp = fp + (matched_id[0].shape[0] - 1.0)\n\n        #Update local and global tp, fp, and fn\n        inv_gt_flag = 1 - gt_flag\n        fn = np.sum(inv_gt_flag)\n        
inv_det_flag = 1 - det_flag\n        fp = fp + np.sum(inv_det_flag)\n\n        global_fp = global_fp + fp\n        global_fn = global_fn + fn\n        local_precision = tp / (tp + fp)\n        local_recall = tp / (tp + fn)\n        fid = open(fid_path, 'a')\n        temp = ('%s______/Precision:_%s_______/Recall:_%s\\n' %(input_id, str(local_precision), str(local_recall)))\n        fid.write(temp)\n        fid.close()\n\n\nglobal_precision = global_tp / (global_tp + global_fp)\nglobal_recall = global_tp / (global_tp + global_fn)\nf_score = 2*global_precision*global_recall/(global_precision+global_recall)\n\nfid = open(fid_path, 'a')\ntemp = ('Precision:_%s_______/Recall:_%s\\n' %(str(global_precision), str(global_recall)))\nfid.write(temp)\nfid.close()\nprint('pb')\n\n\n\n\n\n\n                \n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/Python_scripts/__init__.py",
    "content": ""
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/Python_scripts/polygon_fast.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n__author__ = '古溪'\n\nimport numpy as np\nfrom shapely.geometry import Polygon\n\n\"\"\"\n:param det_x: [1, N] Xs of detection's vertices \n:param det_y: [1, N] Ys of detection's vertices\n:param gt_x: [1, N] Xs of groundtruth's vertices\n:param gt_y: [1, N] Ys of groundtruth's vertices\n##############\nAll the calculation of 'AREA' in this script is handled by:\n1) First generating a binary mask with the polygon area filled up with 1's\n2) Summing up all the 1's\n\"\"\"\n\n\ndef area(x, y):\n    polygon = Polygon(np.stack([x, y], axis=1))\n    return float(polygon.area)\n\n\ndef approx_area_of_intersection(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    This helper determine if both polygons are intersecting with each others with an approximation method.\n    Area of intersection represented by the minimum bounding rectangular [xmin, ymin, xmax, ymax]\n    \"\"\"\n    det_ymax = np.max(det_y)\n    det_xmax = np.max(det_x)\n    det_ymin = np.min(det_y)\n    det_xmin = np.min(det_x)\n\n    gt_ymax = np.max(gt_y)\n    gt_xmax = np.max(gt_x)\n    gt_ymin = np.min(gt_y)\n    gt_xmin = np.min(gt_x)\n\n    all_min_ymax = np.minimum(det_ymax, gt_ymax)\n    all_max_ymin = np.maximum(det_ymin, gt_ymin)\n\n    intersect_heights = np.maximum(0.0, (all_min_ymax - all_max_ymin))\n\n    all_min_xmax = np.minimum(det_xmax, gt_xmax)\n    all_max_xmin = np.maximum(det_xmin, gt_xmin)\n    intersect_widths = np.maximum(0.0, (all_min_xmax - all_max_xmin))\n\n    return intersect_heights * intersect_widths\n\ndef area_of_intersection(det_x, det_y, gt_x, gt_y):\n    p1 = Polygon(np.stack([det_x, det_y], axis=1)).buffer(0)\n    p2 = Polygon(np.stack([gt_x, gt_y], axis=1)).buffer(0)\n    return float(p1.intersection(p2).area)\n\ndef area_of_union(det_x, det_y, gt_x, gt_y):\n    p1 = Polygon(np.stack([det_x, det_y], axis=1)).buffer(0)\n    p2 = Polygon(np.stack([gt_x, gt_y], axis=1)).buffer(0)\n    return 
float(p1.union(p2).area)\n\n\ndef iou(det_x, det_y, gt_x, gt_y):\n    return area_of_intersection(det_x, det_y, gt_x, gt_y) / (area_of_union(det_x, det_y, gt_x, gt_y) + 1.0)\n\n\ndef iod(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    This helper determine the fraction of intersection area over detection area\n    \"\"\"\n    return area_of_intersection(det_x, det_y, gt_x, gt_y) / (area(det_x, det_y) + 1.0)"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/Python_scripts/polygon_wrapper.py",
    "content": "import numpy as np\n# from skimage.draw import polygon\nfrom shapely.geometry import Polygon\n\n\"\"\"\n:param det_x: [1, N] Xs of detection's vertices \n:param det_y: [1, N] Ys of detection's vertices\n:param gt_x: [1, N] Xs of groundtruth's vertices\n:param gt_y: [1, N] Ys of groundtruth's vertices\n\n##############\nAll the calculation of 'AREA' in this script is handled by:\n1) First generating a binary mask with the polygon area filled up with 1's\n2) Summing up all the 1's\n\"\"\"\n\n\ndef area(x, y):\n    \"\"\"\n    This helper calculates the area given x and y vertices.\n    \"\"\"\n    return shapely_area(x, y)\n    ymax = np.max(y)\n    xmax = np.max(x)\n    bin_mask = np.zeros((ymax, xmax))\n    rr, cc = polygon(y, x)\n    bin_mask[rr, cc] = 1\n    area = np.sum(bin_mask)\n    return area\n    #return np.round(area, 2)\n\n\ndef approx_area_of_intersection(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    This helper determine if both polygons are intersecting with each others with an approximation method.\n    Area of intersection represented by the minimum bounding rectangular [xmin, ymin, xmax, ymax]\n    \"\"\"\n    det_ymax = np.max(det_y)\n    det_xmax = np.max(det_x)\n    det_ymin = np.min(det_y)\n    det_xmin = np.min(det_x)\n\n    gt_ymax = np.max(gt_y)\n    gt_xmax = np.max(gt_x)\n    gt_ymin = np.min(gt_y)\n    gt_xmin = np.min(gt_x)\n\n    all_min_ymax = np.minimum(det_ymax, gt_ymax)\n    all_max_ymin = np.maximum(det_ymin, gt_ymin)\n\n    intersect_heights = np.maximum(0.0, (all_min_ymax - all_max_ymin))\n\n    all_min_xmax = np.minimum(det_xmax, gt_xmax)\n    all_max_xmin = np.maximum(det_xmin, gt_xmin)\n    intersect_widths = np.maximum(0.0, (all_min_xmax - all_max_xmin))\n\n    return intersect_heights * intersect_widths\n\ndef shapely_area_of_intersection(det_x, det_y, gt_x, gt_y):\n    p1 = Polygon(np.stack([det_x, det_y], axis=1)).buffer(0)\n    p2 = Polygon(np.stack([gt_x, gt_y], axis=1)).buffer(0)\n    return 
int(p1.intersection(p2).area)\n\ndef shapely_area(x, y):\n    polygon = Polygon(np.stack([x, y], axis=1))\n    return int(polygon.area)\n\ndef area_of_intersection(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    This helper calculates the area of intersection.\n    \"\"\"\n    return shapely_area_of_intersection(det_x, det_y, gt_x, gt_y)\n    if approx_area_of_intersection(det_x, det_y, gt_x, gt_y) > 1: #only proceed if it passes the approximation test\n        ymax = np.maximum(np.max(det_y), np.max(gt_y)) + 1\n        xmax = np.maximum(np.max(det_x), np.max(gt_x)) + 1\n        bin_mask = np.zeros((ymax, xmax))\n        det_bin_mask = np.zeros_like(bin_mask)\n        gt_bin_mask = np.zeros_like(bin_mask)\n\n        rr, cc = polygon(det_y, det_x)\n        det_bin_mask[rr, cc] = 1\n\n        rr, cc = polygon(gt_y, gt_x)\n        gt_bin_mask[rr, cc] = 1\n\n        final_bin_mask = det_bin_mask + gt_bin_mask\n\n        inter_map = np.where(final_bin_mask == 2, 1, 0)\n        inter = np.sum(inter_map)\n        return inter\n#        return np.round(inter, 2)\n    else:\n        return 0\n\n\ndef iou(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    This helper determine the intersection over union of two polygons.\n    \"\"\"\n\n    if approx_area_of_intersection(det_x, det_y, gt_x, gt_y) > 1: #only proceed if it passes the approximation test\n        ymax = np.maximum(np.max(det_y), np.max(gt_y)) + 1\n        xmax = np.maximum(np.max(det_x), np.max(gt_x)) + 1\n        bin_mask = np.zeros((ymax, xmax))\n        det_bin_mask = np.zeros_like(bin_mask)\n        gt_bin_mask = np.zeros_like(bin_mask)\n\n        rr, cc = polygon(det_y, det_x)\n        det_bin_mask[rr, cc] = 1\n\n        rr, cc = polygon(gt_y, gt_x)\n        gt_bin_mask[rr, cc] = 1\n\n        final_bin_mask = det_bin_mask + gt_bin_mask\n\n        #inter_map = np.zeros_like(final_bin_mask)\n        inter_map = np.where(final_bin_mask == 2, 1, 0)\n        inter = np.sum(inter_map)\n\n        #union_map = 
np.zeros_like(final_bin_mask)\n        union_map = np.where(final_bin_mask > 0, 1, 0)\n        union = np.sum(union_map)\n        return inter / float(union + 1.0)\n        #return np.round(inter / float(union + 1.0), 2)\n    else:\n        return 0\n\ndef iod(det_x, det_y, gt_x, gt_y):\n    \"\"\"\n    This helper determine the fraction of intersection area over detection area\n    \"\"\"\n\n    if approx_area_of_intersection(det_x, det_y, gt_x, gt_y) > 1: #only proceed if it passes the approximation test\n        ymax = np.maximum(np.max(det_y), np.max(gt_y)) + 1\n        xmax = np.maximum(np.max(det_x), np.max(gt_x)) + 1\n        bin_mask = np.zeros((ymax, xmax))\n        det_bin_mask = np.zeros_like(bin_mask)\n        gt_bin_mask = np.zeros_like(bin_mask)\n\n        rr, cc = polygon(det_y, det_x)\n        det_bin_mask[rr, cc] = 1\n\n        rr, cc = polygon(gt_y, gt_x)\n        gt_bin_mask[rr, cc] = 1\n\n        final_bin_mask = det_bin_mask + gt_bin_mask\n\n        inter_map = np.where(final_bin_mask == 2, 1, 0)\n        inter = np.round(np.sum(inter_map), 2)\n\n        det = np.round(np.sum(det_bin_mask), 2)\n        return inter / float(det + 1.0)\n        #return np.round(inter / float(det + 1.0), 2)\n    else:\n        return 0"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/README.md",
    "content": "## Description\n\nThese codes are the official evaluation protocol implementation for Total-Text. Two methods are made available: Deteval and Pascal VOC protocols.\n\nNewly added feature - 'Do not care' candidates filtering is now available in the latest [Python_scripts](https://github.com/cs-chan/Total-Text-Dataset/tree/master/Evaluation_Protocol/Python_scripts) folder.\n\n### Deteval\nWe recommend tr = 0.7 and tp = 0.6 threshold for a fairer evaluation with polygon ground-truth and detection format.\n\n### Pascal VOC\nThe conventional 0.5 threshold works fine.\n"
  },
  {
    "path": "dataset/total_text/Evaluation_Protocol/__init__.py",
    "content": ""
  },
  {
    "path": "dataset/total_text/README.md",
    "content": "### Total-Text dataset\n\nHome page: https://github.com/cs-chan/Total-Text-Dataset\n\ndownload dataset and make it to a dataset format\n\n```shell\nbash download.sh\n```\n\n"
  },
  {
    "path": "dataset/total_text/download.sh",
    "content": "set -e\n\n# download image data\nbash gdrivedl.sh https://drive.google.com/file/d/1bC68CzsSVTusZVvOkk7imSZSbgD1MqK2/view?usp=sharing totaltext.zip\n\nunzip totaltext.zip\nchmod -R o-w Images\nrm -rf __MACOSX\nmv Images/Train/img61.JPG Images/Train/img61.jpg\n\n# download ground truth data\nbash gdrivedl.sh https://drive.google.com/file/d/19quCaJGePvTc3yPZ7MAGNijjKfy77-ke/view?usp=sharing groundtruth_text.zip\n\nunzip groundtruth_text.zip -d gt\nchmod -R o-w gt\nrm -rf gt/__MACOSX\nmv gt/Groundtruth/Polygon/* gt/\nrm -rf gt/Groundtruth\n\nmkdir total-text\nmv Images gt total-text\n\nmv total-text ../../data\n \n"
  },
  {
    "path": "dataset/total_text/gdrivedl.sh",
    "content": "#!/bin/bash\n# modified from https://github.com/matthuisman/files.matthuisman.nz/blob/master/gdrivedl\n\nurl=$1\nfilename=$2\n\n[ -z \"$url\" ] && echo A URL or ID is required first argument && exit 1\n\nfileid=\"\"\ndeclare -a patterns=(\"s/.*\\/file\\/d\\/\\(.*\\)\\/.*/\\1/p\" \"s/.*id\\=\\(.*\\)/\\1/p\" \"s/\\(.*\\)/\\1/p\")\nfor i in \"${patterns[@]}\"\ndo\n   fileid=$(echo $url | sed -n $i)\n   [ ! -z \"$fileid\" ] && break\ndone\n\n[ -z \"$fileid\" ] && echo Could not find Google ID && exit 1\n\necho File ID: $fileid\n\ntmp_file=\"$filename.$$.file\"\ntmp_cookies=\"$filename.$$.cookies\"\ntmp_headers=\"$filename.$$.headers\"\n\nurl='https://docs.google.com/uc?export=download&id='$fileid\necho Downloading: \"$url > $tmp_file\"\nwget --save-cookies \"$tmp_cookies\" -q -S -O - $url 2> \"$tmp_headers\" 1> \"$tmp_file\"\n\nif [[ ! $(find \"$tmp_file\" -type f -size +10000c 2>/dev/null) ]]; then\n   confirm=$(cat \"$tmp_file\" | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\\1/p')\nfi\n\nif [ ! -z \"$confirm\" ]; then\n   url='https://docs.google.com/uc?export=download&id='$fileid'&confirm='$confirm\n   echo Downloading: \"$url > $tmp_file\"\n   wget --load-cookies \"$tmp_cookies\" -q -S -O - $url 2> \"$tmp_headers\" 1> \"$tmp_file\"\nfi\n\n[ -z \"$filename\" ] && filename=$(cat \"$tmp_headers\" | sed -rn 's/.*filename=\\\"(.*)\\\".*/\\1/p')\n[ -z \"$filename\" ] && filename=\"google_drive.file\"\n\necho Moving: \"$tmp_file > $filename\"\n\nmv \"$tmp_file\" \"$filename\"\n\nrm -f \"$tmp_cookies\" \"$tmp_headers\"\n\necho Saved: \"$filename\"\necho DONE!\n\nexit 0"
  },
  {
    "path": "demo.py",
    "content": "import os\nimport time\nimport cv2\nimport numpy as np\nimport torch\nimport torch.backends.cudnn as cudnn\nimport torch.utils.data as data\nfrom dataset import DeployDataset\nfrom network.textnet import TextNet\nfrom cfglib.config import config as cfg, update_config, print_config\nfrom cfglib.option import BaseOptions\nfrom util.augmentation import BaseTransform\nfrom util.visualize import visualize_detection, visualize_gt\nfrom util.misc import to_device, mkdirs,rescale_result\n\n\nimport multiprocessing\nmultiprocessing.set_start_method(\"spawn\", force=True)\n\n\ndef osmkdir(out_dir):\n    import shutil\n    if os.path.exists(out_dir):\n        shutil.rmtree(out_dir)\n    os.makedirs(out_dir)\n\n\ndef write_to_file(contours, file_path):\n    \"\"\"\n    :param contours: [[x1, y1], [x2, y2]... [xn, yn]]\n    :param file_path: target file path\n    \"\"\"\n    # according to total-text evaluation method, output file shoud be formatted to: y0,x0, ..... yn,xn\n    with open(file_path, 'w') as f:\n        for cont in contours:\n            cont = np.stack([cont[:, 0], cont[:, 1]], 1)\n            if cv2.contourArea(cont) <= 0:\n                continue\n            cont = cont.flatten().astype(str).tolist()\n            cont = ','.join(cont)\n            f.write(cont + '\\n')\n\n\ndef inference(model, test_loader, output_dir):\n\n    total_time = 0.\n    if cfg.exp_name != \"MLT2017\" and cfg.exp_name != \"ArT\":\n        osmkdir(output_dir)\n    else:\n        if not os.path.exists(output_dir):\n            mkdirs(output_dir)\n        if cfg.exp_name == \"MLT2017\":\n            out_dir = os.path.join(output_dir, \"{}_{}_{}_{}_{}\".\n                                   format(str(cfg.checkepoch), cfg.test_size[0],\n                                          cfg.test_size[1], cfg.dis_threshold, cfg.cls_threshold))\n            if not os.path.exists(out_dir):\n                mkdirs(out_dir)\n\n    art_results = dict()\n    for i, (image, meta) in 
enumerate(test_loader):\n        input_dict = dict()\n        idx = 0  # test mode can only run with batch_size == 1\n        H, W = meta['Height'][idx].item(), meta['Width'][idx].item()\n        print(meta['image_id'], (H, W))\n\n        input_dict['img'] = to_device(image)\n\n        # get detection result\n        start = time.time()\n        output_dict = model(input_dict)\n        torch.cuda.synchronize()\n        end = time.time()\n        if i > 0:\n            total_time += end - start\n            fps = (i + 1) / total_time\n        else:\n            fps = 0.0\n\n        print('detect {} / {} images: {}. ({:.2f} fps)'.\n              format(i + 1, len(test_loader), meta['image_id'][idx], fps))\n\n        # visualization\n        img_show = image[idx].permute(1, 2, 0).cpu().numpy()\n        img_show = ((img_show * cfg.stds + cfg.means) * 255).astype(np.uint8)\n\n        if cfg.viz:\n            gt_contour = []\n            label_tag = meta['label_tag'][idx].int().cpu().numpy()\n            for annot, n_annot in zip(meta['annotation'][idx], meta['n_annotation'][idx]):\n                if n_annot.item() > 0:\n                    gt_contour.append(annot[:n_annot].int().cpu().numpy())\n\n            gt_vis = visualize_gt(img_show, gt_contour, label_tag)\n            show_boundary, heat_map = visualize_detection(img_show, output_dict, meta=meta)\n\n            show_map = np.concatenate([heat_map, gt_vis], axis=1)\n            show_map = cv2.resize(show_map, (320 * 3, 320))\n            im_vis = np.concatenate([show_map, show_boundary], axis=0)\n\n            path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name), meta['image_id'][idx].split(\".\")[0]+\".jpg\")\n            cv2.imwrite(path, im_vis)\n\n        contours = output_dict[\"py_preds\"][-1].int().cpu().numpy()\n        img_show, contours = rescale_result(img_show, contours, H, W)\n\n        # path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name), meta['image_id'][idx].split(\".\")[0] 
+ \".jpg\")\n        # im_show = img_show.copy()\n        # im_show = np.ascontiguousarray(im_show[:, :, ::-1])\n        # cv2.drawContours(im_show, [gt_contour[i] for i, tag in enumerate(label_tag) if tag >0], -1, (0, 255, 0), 4)\n        # cv2.drawContours(im_show, contours, -1, (0, 0, 255), 2)\n        # cv2.imwrite(path, im_show)\n\n        fname = meta['image_id'][idx].replace('jpg', 'txt')\n        write_to_file(contours, os.path.join(output_dir, fname))\n\n\ndef main(vis_dir_path):\n\n    osmkdir(vis_dir_path)\n    testset = DeployDataset(\n        image_root=cfg.img_root,\n        transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n    )\n\n    if cfg.cuda:\n        cudnn.benchmark = True\n\n    # Data\n    test_loader = data.DataLoader(testset, batch_size=1, shuffle=False, num_workers=cfg.num_workers)\n\n    # Model\n    model = TextNet(is_training=False, backbone=cfg.net)\n    model_path = os.path.join(cfg.save_dir, cfg.exp_name,\n                              'TextBPN_{}_{}.pth'.format(model.backbone_name, cfg.checkepoch))\n\n    model.load_model(model_path)\n    model = model.to(cfg.device)  # copy to cuda\n    model.eval()\n    with torch.no_grad():\n        print('Start testing TextBPN++.')\n        output_dir = os.path.join(cfg.output_dir, cfg.exp_name)\n        inference(model, test_loader, output_dir)\n\n\nif __name__ == \"__main__\":\n    # parse arguments\n    option = BaseOptions()\n    args = option.initialize()\n\n    update_config(cfg, args)\n    print_config(cfg)\n\n    vis_dir = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name))\n\n    if not os.path.exists(vis_dir):\n        mkdirs(vis_dir)\n    # main\n    main(vis_dir)\n"
  },
  {
    "path": "demo.sh",
    "content": "#!/bin/bash\nCUDA_LAUNCH_BLOCKING=1 python3 demo.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch 1135 --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 0 --img_root ./Test1  --viz\n"
  },
  {
    "path": "eval.sh",
    "content": "#!/bin/bash\n##################### Total-Text ###################################\n# test_size=(640,1024)--cfglib/option\n#CUDA_LAUNCH_BLOCKING=1 python eval_textBPN.py --exp_name Totaltext --checkepoch 540 --test_size 640 800 --dis_threshold 0.315 --cls_threshold 0.85 --gpu 1\n###################### CTW-1500 ####################################\n# test_size=(640,1024)--cfglib/option\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name Ctw1500 --checkepoch 600 --test_size 640 1024 --dis_threshold 0.375 --cls_threshold 0.8 --gpu 1\n\n#################### MSRA-TD500 ######################################\n# test_size=(640,1024)--cfglib/option\n#CUDA_LAUNCH_BLOCKING=1 python eval_textBPN.py --exp_name TD500 --checkepoch 680 --dis_threshold 0.3 --cls_threshold 0.925 --gpu 1\n\n\n#for ((i=55; i<=660; i=i+5))\n# do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name Ctw1500 --checkepoch $i --test_size 640 1024 --dis_threshold 0.35 --cls_threshold 0.825 --gpu 0;\n#done\n\n\nfor ((i=55; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name Totaltext --checkepoch $i --test_size 640 1024 --dis_threshold 0.3 --cls_threshold 0.85 --gpu 0;\n done\n"
  },
  {
    "path": "eval_textBPN.py",
    "content": "import os\nimport time\nimport cv2\nimport numpy as np\nimport json\nfrom shapely.geometry import *\nimport torch\nimport subprocess\nimport torch.backends.cudnn as cudnn\nimport torch.utils.data as data\nfrom dataset import TotalText, Ctw1500Text, Icdar15Text, Mlt2017Text, TD500Text, \\\n    ArtText, ArtTextJson, Mlt2019Text, Ctw1500Text_New, TotalText_New\nfrom network.textnet import TextNet\nfrom cfglib.config import config as cfg, update_config, print_config\nfrom cfglib.option import BaseOptions\nfrom util.augmentation import BaseTransform\nfrom util.visualize import visualize_detection, visualize_gt\nfrom util.misc import to_device, mkdirs,rescale_result\nfrom util.eval import deal_eval_total_text, deal_eval_ctw1500, deal_eval_icdar15, \\\n    deal_eval_TD500, data_transfer_ICDAR, data_transfer_TD500, data_transfer_MLT2017\n\nimport multiprocessing\nmultiprocessing.set_start_method(\"spawn\", force=True)\n\n\ndef osmkdir(out_dir):\n    import shutil\n    if os.path.exists(out_dir):\n        shutil.rmtree(out_dir)\n    os.makedirs(out_dir)\n\n\ndef write_to_file(contours, file_path):\n    \"\"\"\n    :param contours: [[x1, y1], [x2, y2]... [xn, yn]]\n    :param file_path: target file path\n    \"\"\"\n    # according to total-text evaluation method, output file shoud be formatted to: y0,x0, ..... 
yn,xn\n    with open(file_path, 'w') as f:\n        for cont in contours:\n            cont = np.stack([cont[:, 0], cont[:, 1]], 1)\n            if cv2.contourArea(cont) <= 0:\n                continue\n            cont = cont.flatten().astype(str).tolist()\n            cont = ','.join(cont)\n            f.write(cont + '\\n')\n\n\ndef inference(model, test_loader, output_dir):\n\n    total_time = 0.\n    if cfg.exp_name != \"MLT2017\" and cfg.exp_name != \"ArT\":\n        osmkdir(output_dir)\n    else:\n        if not os.path.exists(output_dir):\n            mkdirs(output_dir)\n        if cfg.exp_name == \"MLT2017\":\n            out_dir = os.path.join(output_dir, \"{}_{}_{}_{}_{}\".\n                                   format(str(cfg.checkepoch), cfg.test_size[0],\n                                          cfg.test_size[1], cfg.dis_threshold, cfg.cls_threshold))\n            if not os.path.exists(out_dir):\n                mkdirs(out_dir)\n\n    art_results = dict()\n    for i, (image, meta) in enumerate(test_loader):\n        input_dict = dict()\n        idx = 0  # test mode can only run with batch_size == 1\n        H, W = meta['Height'][idx].item(), meta['Width'][idx].item()\n        print(meta['image_id'], (H, W))\n\n        input_dict['img'] = to_device(image)\n\n        # get detection result\n        start = time.time()\n        output_dict = model(input_dict)\n        torch.cuda.synchronize()\n        end = time.time()\n        if i > 0:\n            total_time += end - start\n            fps = (i + 1) / total_time\n        else:\n            fps = 0.0\n\n        print('detect {} / {} images: {}. 
({:.2f} fps)'.\n              format(i + 1, len(test_loader), meta['image_id'][idx], fps))\n\n        # visualization\n        img_show = image[idx].permute(1, 2, 0).cpu().numpy()\n        img_show = ((img_show * cfg.stds + cfg.means) * 255).astype(np.uint8)\n\n        if cfg.viz:\n        # if True:\n            gt_contour = []\n            label_tag = meta['label_tag'][idx].int().cpu().numpy()\n            for annot, n_annot in zip(meta['annotation'][idx], meta['n_annotation'][idx]):\n                if n_annot.item() > 0:\n                    gt_contour.append(annot[:n_annot].int().cpu().numpy())\n\n            gt_vis = visualize_gt(img_show, gt_contour, label_tag)\n            show_boundary, heat_map = visualize_detection(img_show, output_dict, meta=meta)\n\n            show_map = np.concatenate([heat_map, gt_vis], axis=1)\n            show_map = cv2.resize(show_map, (320 * 3, 320))\n            im_vis = np.concatenate([show_map, show_boundary], axis=0)\n\n            path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name), meta['image_id'][idx].split(\".\")[0]+\".jpg\")\n            cv2.imwrite(path, im_vis)\n\n        contours = output_dict[\"py_preds\"][-1].int().cpu().numpy()\n        img_show, contours = rescale_result(img_show, contours, H, W)\n\n        # path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name), meta['image_id'][idx].split(\".\")[0] + \".jpg\")\n        # im_show = img_show.copy()\n        # im_show = np.ascontiguousarray(im_show[:, :, ::-1])\n        # cv2.drawContours(im_show, [gt_contour[i] for i, tag in enumerate(label_tag) if tag >0], -1, (0, 255, 0), 4)\n        # cv2.drawContours(im_show, contours, -1, (0, 0, 255), 2)\n        # cv2.imwrite(path, im_show)\n\n        # empty GPU  cache\n        torch.cuda.empty_cache()\n\n        # write to file\n        if cfg.exp_name == \"Icdar2015\":\n            fname = \"res_\" + meta['image_id'][idx].replace('jpg', 'txt')\n            contours = data_transfer_ICDAR(contours)\n 
           write_to_file(contours, os.path.join(output_dir, fname))\n        elif cfg.exp_name == \"MLT2017\":\n            fname = meta['image_id'][idx].split(\"/\")[-1].replace('ts', 'res')\n            fname = fname.split(\".\")[0] + \".txt\"\n            data_transfer_MLT2017(contours, os.path.join(out_dir, fname))\n        elif cfg.exp_name == \"TD500\":\n            fname = \"res_\" + meta['image_id'][idx].split(\".\")[0]+\".txt\"\n            data_transfer_TD500(contours, os.path.join(output_dir, fname))\n        elif cfg.exp_name == \"ArT\":\n            fname = meta['image_id'][idx].split(\".\")[0].replace('gt', 'res')\n            art_result = []\n            for j in range(len(contours)):\n                art_res = dict()\n                S = cv2.contourArea(contours[j], oriented=True)\n                if S < 0:\n                    art_res['points'] = contours[j].tolist()[::-1]\n                else:\n                    print((meta['image_id'], S))\n                    continue\n                art_res['confidence'] = float(output_dict['confidences'][j])\n                art_result.append(art_res)\n            art_results[fname] = art_result\n            # print(art_results)\n        else:\n            fname = meta['image_id'][idx].replace('jpg', 'txt')\n            write_to_file(contours, os.path.join(output_dir, fname))\n    if cfg.exp_name == \"ArT\":\n        with open(output_dir + '/art_test_{}_{}_{}_{}_{}.json'.\n                format(cfg.checkepoch, cfg.test_size[0], cfg.test_size[1],\n                       cfg.dis_threshold, cfg.cls_threshold), 'w') as f:\n            json.dump(art_results, f)\n    elif cfg.exp_name == \"MLT2017\":\n        father_path = \"{}_{}_{}_{}_{}\".format(str(cfg.checkepoch), cfg.test_size[0],\n                                              cfg.test_size[1], cfg.dis_threshold, cfg.cls_threshold)\n        subprocess.call(['sh', './output/MLT2017/eval_zip.sh', father_path])\n        pass\n\n\ndef main(vis_dir_path):\n\n  
  osmkdir(vis_dir_path)\n    if cfg.exp_name == \"Totaltext\":\n        testset = TotalText(\n            data_root='data/total-text-mat',\n            ignore_list=None,\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n\n    elif cfg.exp_name == \"Ctw1500\":\n        testset = Ctw1500Text(\n            data_root='data/ctw1500',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"Icdar2015\":\n        testset = Icdar15Text(\n            data_root='data/Icdar2015',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"MLT2017\":\n        testset = Mlt2017Text(\n            data_root='data/MLT2017',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"TD500\":\n        testset = TD500Text(\n            data_root='data/TD500',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"ArT\":\n        testset = ArtTextJson(\n            data_root='data/ArT',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    else:\n        print(\"{} is not supported\".format(cfg.exp_name))\n\n    if cfg.cuda:\n        cudnn.benchmark = True\n\n    # Data\n    test_loader = data.DataLoader(testset, batch_size=1, shuffle=False, num_workers=cfg.num_workers)\n\n    # Model\n    model = TextNet(is_training=False, backbone=cfg.net)\n    model_path = os.path.join(cfg.save_dir, cfg.exp_name,\n                              'TextBPN_{}_{}.pth'.format(model.backbone_name, cfg.checkepoch))\n\n    model.load_model(model_path)\n    model = model.to(cfg.device)  # copy to cuda\n    model.eval()\n    with torch.no_grad():\n        print('Start testing TextBPN++.')\n        output_dir = os.path.join(cfg.output_dir, cfg.exp_name)\n        inference(model, test_loader, output_dir)\n\n    if cfg.exp_name == \"Totaltext\":\n        deal_eval_total_text(debug=True)\n\n    elif cfg.exp_name == \"Ctw1500\":\n        deal_eval_ctw1500(debug=True)\n\n    elif cfg.exp_name == \"Icdar2015\":\n        deal_eval_icdar15(debug=True)\n    elif cfg.exp_name == \"TD500\":\n        deal_eval_TD500(debug=True)\n    else:\n        print(\"{} is not supported\".format(cfg.exp_name))\n\n\nif __name__ == \"__main__\":\n    # parse arguments\n    option = BaseOptions()\n    args = option.initialize()\n\n    update_config(cfg, args)\n    print_config(cfg)\n\n    vis_dir = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name))\n\n    if not os.path.exists(vis_dir):\n        mkdirs(vis_dir)\n    # main\n    main(vis_dir)\n"
  },
  {
    "path": "eval_textBPN_speed.py",
    "content": "import os\nimport time\nimport torch\nimport torch.backends.cudnn as cudnn\nimport torch.utils.data as data\nfrom dataset import TotalText, Ctw1500Text, Icdar15Text, Mlt2017Text, TD500Text, \\\n    ArtText, ArtTextJson, Mlt2019Text, Ctw1500Text_New, TotalText_New\nfrom network.textnet import TextNet\nfrom util.augmentation import BaseTransform\nfrom cfglib.config import config as cfg, update_config, print_config\nfrom cfglib.option import BaseOptions\nfrom util.misc import to_device, mkdirs\n\nimport multiprocessing\nmultiprocessing.set_start_method(\"spawn\", force=True)\n\n\ndef osmkdir(out_dir):\n    import shutil\n    if os.path.exists(out_dir):\n        shutil.rmtree(out_dir)\n    os.makedirs(out_dir)\n\n\ndef inference(model, test_loader, output_dir):\n\n    total_time = 0.\n    if cfg.exp_name != \"MLT2017\" and cfg.exp_name != \"ArT\":\n        osmkdir(output_dir)\n    else:\n        if not os.path.exists(output_dir):\n            mkdirs(output_dir)\n\n    for i, (image, meta) in enumerate(test_loader):\n        input_dict = dict()\n        input_dict['img'] = to_device(image)\n        # init model\n        if i == 0:\n            output_dict = model(input_dict, test_speed=True)\n\n        for k in range(0, 50):\n            start = time.time()\n            output_dict = model(input_dict, test_speed=True)\n            torch.cuda.synchronize()\n            end = time.time()\n            total_time += end - start\n\n        fps = (i + 1)*50 / total_time\n\n        print('detect {} / {} images: {}. 
({:.2f} fps)'.\n              format(i + 1, len(test_loader), meta['image_id'][0], fps))\n\n\ndef main(vis_dir_path):\n\n    osmkdir(vis_dir_path)\n    if cfg.exp_name == \"Totaltext\":\n        testset = TotalText(\n            data_root='data/total-text-mat',\n            ignore_list=None,\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n\n    elif cfg.exp_name == \"Ctw1500\":\n        testset = Ctw1500Text(\n            data_root='data/ctw1500',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"Icdar2015\":\n        testset = Icdar15Text(\n            data_root='data/Icdar2015',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"MLT2017\":\n        testset = Mlt2017Text(\n            data_root='data/MLT2017',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"TD500\":\n        testset = TD500Text(\n            data_root='data/TD500',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    elif cfg.exp_name == \"ArT\":\n        testset = ArtTextJson(\n            data_root='data/ArT',\n            is_training=False,\n            transform=BaseTransform(size=cfg.test_size, mean=cfg.means, std=cfg.stds)\n        )\n    else:\n        print(\"{} is not supported\".format(cfg.exp_name))\n\n    if cfg.cuda:\n        cudnn.benchmark = True\n\n    test_loader = data.DataLoader(testset, batch_size=1, shuffle=False, num_workers=cfg.num_workers, pin_memory=True)\n\n    # Model\n    model = TextNet(is_training=False, backbone=cfg.net)\n    model_path = os.path.join(cfg.save_dir, cfg.exp_name,\n         
                     'TextBPN_{}_{}.pth'.format(model.backbone_name, cfg.checkepoch))\n\n    model.load_model(model_path)\n    model = model.to(cfg.device)  # copy to cuda\n    model.eval()\n    with torch.no_grad():\n        print('Start testing TextBPN++.')\n        output_dir = os.path.join(cfg.output_dir, cfg.exp_name)\n        inference(model, test_loader, output_dir)\n\n\nif __name__ == \"__main__\":\n    # parse arguments\n    option = BaseOptions()\n    args = option.initialize()\n\n    update_config(cfg, args)\n    print_config(cfg)\n\n    vis_dir = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name))\n\n    if not os.path.exists(vis_dir):\n        mkdirs(vis_dir)\n    # main\n    main(vis_dir)\n"
  },
  {
    "path": "model/README.md",
    "content": "Save the model here...\n\n"
  },
  {
    "path": "network/Reg_loss.py",
    "content": "# -*- coding: utf-8 -*-\n# @Time    : 10/1/21\n# @Author  : GXYM\nimport torch\nfrom torch import nn\nimport numpy as np\nimport torch.nn.functional as F\n\n\nclass PolyMatchingLoss(nn.Module):\n    def __init__(self, pnum, device, loss_type=\"L1\"):\n        super(PolyMatchingLoss, self).__init__()\n\n        self.pnum = pnum\n        self.device = device\n        self.loss_type = loss_type\n        self.smooth_L1 = F.smooth_l1_loss\n        self.L2_loss = torch.nn.MSELoss(reduce=False, size_average=False)\n\n        batch_size = 1\n        pidxall = np.zeros(shape=(batch_size, pnum, pnum), dtype=np.int32)\n        for b in range(batch_size):\n            for i in range(pnum):\n                pidx = (np.arange(pnum) + i) % pnum\n                pidxall[b, i] = pidx\n\n        pidxall = torch.from_numpy(np.reshape(pidxall, newshape=(batch_size, -1))).to(device)\n        self.feature_id = pidxall.unsqueeze_(2).long().expand(pidxall.size(0), pidxall.size(1), 2).detach()\n        print(self.feature_id.shape)\n\n    def match_loss(self, pred, gt):\n        batch_size = pred.shape[0]\n        feature_id = self.feature_id.expand(batch_size, self.feature_id.size(1), 2)\n\n        gt_expand = torch.gather(gt, 1, feature_id).view(batch_size, self.pnum, self.pnum, 2)\n        pred_expand = pred.unsqueeze(1)\n\n        if self.loss_type == \"L2\":\n            dis = self.L2_loss(pred_expand, gt_expand)\n            dis = dis.sum(3).sqrt().mean(2)\n        elif self.loss_type == \"L1\":\n            dis = self.smooth_L1(pred_expand, gt_expand, reduction='none')\n            dis = dis.sum(3).mean(2)\n\n        min_dis, min_id = torch.min(dis, dim=1, keepdim=True)\n\n        return min_dis\n\n    def forward(self, pred_list, gt):\n        loss = torch.tensor(0.)\n        for pred in pred_list:\n            loss += torch.mean(self.match_loss(pred, gt))\n\n        return loss / torch.tensor(len(pred_list))\n\n        # los = []\n        # for pred in pred_list:\n  
      #     los.append(self.match_loss(pred, gt))\n        #\n        # los_b = torch.tensor(0.)\n        # loss_c = torch.tensor(0.)\n        # for i, _ in enumerate(los):\n        #     los_b += torch.mean(los[i])\n        #     loss_c += (torch.mean(torch.clamp(los[i] - los[i - 1], min=0.0)) if i > 0 else torch.tensor(0.))\n        # loss = los_b / torch.tensor(len(los)) + 0.5*loss_c / torch.tensor(len(los)-1)\n        #\n        # return loss\n\n\nclass AttentionLoss(nn.Module):\n    def __init__(self, beta=4, gamma=0.5):\n        super(AttentionLoss, self).__init__()\n\n        self.beta = beta\n        self.gamma = gamma\n\n    def forward(self, pred, gt):\n        num_pos = torch.sum(gt)\n        num_neg = torch.sum(1 - gt)\n        alpha = num_neg / (num_pos + num_neg)\n        edge_beta = torch.pow(self.beta, torch.pow(1 - pred, self.gamma))\n        bg_beta = torch.pow(self.beta, torch.pow(pred, self.gamma))\n\n        loss = 0\n        loss = loss - alpha * edge_beta * torch.log(pred) * gt\n        loss = loss - (1 - alpha) * bg_beta * torch.log(1 - pred) * (1 - gt)\n        return torch.mean(loss)\n\n\nclass GeoCrossEntropyLoss(nn.Module):\n    def __init__(self):\n        super(GeoCrossEntropyLoss, self).__init__()\n\n    def forward(self, output, target, poly):\n        output = torch.nn.functional.softmax(output, dim=1)\n        output = torch.log(torch.clamp(output, min=1e-4))\n        poly = poly.view(poly.size(0), 4, poly.size(1) // 4, 2)\n        target = target[..., None, None].expand(poly.size(0), poly.size(1), 1, poly.size(3))\n        target_poly = torch.gather(poly, 2, target)\n        sigma = (poly[:, :, 0] - poly[:, :, 1]).pow(2).sum(2, keepdim=True)\n        kernel = torch.exp(-(poly - target_poly).pow(2).sum(3) / (sigma / 3))\n        loss = -(output * kernel.transpose(2, 1)).sum(1).mean()\n        return loss\n\n\nclass AELoss(nn.Module):\n    def __init__(self):\n        super(AELoss, self).__init__()\n\n    def forward(self, ae, ind, 
ind_mask):\n        \"\"\"\n        ae: [b, 1, h, w]\n        ind: [b, max_objs, max_parts]\n        ind_mask: [b, max_objs, max_parts]\n        obj_mask: [b, max_objs]\n        \"\"\"\n        # first index\n        b, _, h, w = ae.shape\n        b, max_objs, max_parts = ind.shape\n        obj_mask = torch.sum(ind_mask, dim=2) != 0\n\n        ae = ae.view(b, h * w, 1)\n        seed_ind = ind.view(b, max_objs * max_parts, 1)\n        tag = ae.gather(1, seed_ind).view(b, max_objs, max_parts)\n\n        # compute the mean\n        tag_mean = tag * ind_mask\n        tag_mean = tag_mean.sum(2) / (ind_mask.sum(2) + 1e-4)\n\n        # pull ae of the same object to their mean\n        pull_dist = (tag - tag_mean.unsqueeze(2)).pow(2) * ind_mask\n        obj_num = obj_mask.sum(dim=1).float()\n        pull = (pull_dist.sum(dim=(1, 2)) / (obj_num + 1e-4)).sum()\n        pull /= b\n\n        # push away the mean of different objects\n        push_dist = torch.abs(tag_mean.unsqueeze(1) - tag_mean.unsqueeze(2))\n        push_dist = 1 - push_dist\n        push_dist = nn.functional.relu(push_dist, inplace=True)\n        obj_mask = (obj_mask.unsqueeze(1) + obj_mask.unsqueeze(2)) == 2\n        push_dist = push_dist * obj_mask.float()\n        push = ((push_dist.sum(dim=(1, 2)) - obj_num) / (obj_num * (obj_num - 1) + 1e-4)).sum()\n        push /= b\n        return pull, push\n\n\ndef smooth_l1_loss(inputs, target, sigma=9.0):\n    try:\n        diff = torch.abs(inputs - target)\n        less_one = (diff < 1.0 / sigma).float()\n        loss = less_one * 0.5 * diff ** 2 * sigma \\\n               + torch.abs(torch.tensor(1.0) - less_one) * (diff - 0.5 / sigma)\n        loss = torch.mean(loss) if loss.numel() > 0 else torch.tensor(0.0)\n    except Exception as e:\n        print('RPN_REGR_Loss Exception:', e)\n        loss = torch.tensor(0.0)\n\n    return loss\n\n\ndef _neg_loss(pred, gt):\n    ''' Modified focal loss. 
Exactly the same as CornerNet.\n        Runs faster and costs a little bit more memory\n        Arguments:\n            pred (batch x c x h x w)\n            gt (batch x c x h x w)\n    '''\n    pos_inds = gt.eq(1).float()\n    neg_inds = gt.lt(1).float()\n\n    neg_weights = torch.pow(1 - gt, 4)\n\n    loss = 0\n\n    pos_loss = torch.log(pred) * torch.pow(1 - pred, 2) * pos_inds\n    neg_loss = torch.log(1 - pred) * torch.pow(pred, 2) * neg_weights * neg_inds\n\n    num_pos = pos_inds.float().sum()\n    pos_loss = pos_loss.sum()\n    neg_loss = neg_loss.sum()\n\n    if num_pos == 0:\n        loss = loss - neg_loss\n    else:\n        loss = loss - (pos_loss + neg_loss) / num_pos\n    return loss\n\n\nclass FocalLoss(nn.Module):\n    '''nn.Module wrapper for focal loss'''\n    def __init__(self):\n        super(FocalLoss, self).__init__()\n        self.neg_loss = _neg_loss\n\n    def forward(self, out, target):\n        return self.neg_loss(out, target)"
  },
  {
    "path": "network/Seg_loss.py",
    "content": "# -*- coding: utf-8 -*-\n# @Time    : 10/1/21\n# @Author  : GXYM\nimport torch\nfrom torch import nn\nimport numpy as np\n\n\nclass SegmentLoss(nn.Module):\n    def __init__(self, Lambda, ratio=3, reduction='mean'):\n        \"\"\"Implement PSE Loss.\n        \"\"\"\n        super(SegmentLoss, self).__init__()\n        assert reduction in ['mean', 'sum'], \" reduction must in ['mean','sum']\"\n        self.Lambda = Lambda\n        self.ratio = ratio\n        self.reduction = reduction\n\n    def forward(self, outputs, labels, training_masks, th=0.5):\n        texts = outputs[:, -1, :, :]\n        kernels = outputs[:, :-1, :, :]\n        gt_texts = labels[:, -1, :, :]\n        gt_kernels = labels[:, :-1, :, :]\n\n        selected_masks = self.ohem_batch(texts, gt_texts, training_masks)\n        selected_masks = selected_masks.to(outputs.device)\n\n        loss_text = self.dice_loss(texts, gt_texts, selected_masks)\n\n        loss_kernels = []\n        # mask0 = torch.sigmoid(texts).data.cpu().numpy()\n        mask0 = texts.data.cpu().numpy()\n        mask1 = training_masks.data.cpu().numpy()\n        selected_masks = ((mask0 > th) & (mask1 > th)).astype('float32')\n        selected_masks = torch.from_numpy(selected_masks).float()\n        selected_masks = selected_masks.to(outputs.device)\n        kernels_num = gt_kernels.size()[1]\n        for i in range(kernels_num):\n            kernel_i = kernels[:, i, :, :]\n            gt_kernel_i = gt_kernels[:, i, :, :]\n            loss_kernel_i = self.dice_loss(kernel_i, gt_kernel_i, selected_masks)\n            loss_kernels.append(loss_kernel_i)\n        loss_kernels = torch.stack(loss_kernels).mean(0)\n        if self.reduction == 'mean':\n            loss_text = loss_text.mean()\n            loss_kernels = loss_kernels.mean()\n        elif self.reduction == 'sum':\n            loss_text = loss_text.sum()\n            loss_kernels = loss_kernels.sum()\n\n        loss = self.Lambda *loss_text + 
(1-self.Lambda)*loss_kernels\n        return loss_text, loss_kernels, loss\n\n    def dice_loss(self, input, target, mask):\n        # input = torch.sigmoid(input)\n\n        input = input.contiguous().view(input.size()[0], -1)\n        target = target.contiguous().view(target.size()[0], -1)\n        mask = mask.contiguous().view(mask.size()[0], -1)\n\n        input = input * mask\n        target = (target.float()) * mask\n\n        a = torch.sum(input * target, 1)\n        b = torch.sum(input * input, 1) + 0.001\n        c = torch.sum(target * target, 1) + 0.001\n        d = (2 * a) / (b + c)\n        return 1 - d\n\n    def ohem_single(self, score, gt_text, training_mask, th=0.5):\n        pos_num = (int)(np.sum(gt_text > th)) - (int)(np.sum((gt_text > th) & (training_mask <= th)))\n\n        if pos_num == 0:\n            # selected_mask = gt_text.copy() * 0 # may be not good\n            selected_mask = training_mask\n            selected_mask = selected_mask.reshape(1, selected_mask.shape[0], selected_mask.shape[1]).astype('float32')\n            return selected_mask\n\n        neg_num = (int)(np.sum(gt_text <= th))\n        neg_num = (int)(min(pos_num * 3, neg_num))\n\n        if neg_num == 0:\n            selected_mask = training_mask\n            selected_mask = selected_mask.reshape(1, selected_mask.shape[0], selected_mask.shape[1]).astype('float32')\n            return selected_mask\n\n        neg_score = score[gt_text <= th]\n        # 将负样本得分从高到低排序\n        neg_score_sorted = np.sort(-neg_score)\n        threshold = -neg_score_sorted[neg_num - 1]\n        # 选出 得分高的 负样本 和正样本 的 mask\n        selected_mask = ((score >= threshold) | (gt_text > th)) & (training_mask > th)\n        selected_mask = selected_mask.reshape(1, selected_mask.shape[0], selected_mask.shape[1]).astype('float32')\n        return selected_mask\n\n    def ohem_batch(self, scores, gt_texts, training_masks):\n        scores = scores.data.cpu().numpy()\n        gt_texts = 
gt_texts.data.cpu().numpy()\n        training_masks = training_masks.data.cpu().numpy()\n        selected_masks = []\n        for i in range(scores.shape[0]):\n            selected_masks.append(self.ohem_single(scores[i, :, :], gt_texts[i, :, :], training_masks[i, :, :]))\n\n        selected_masks = np.concatenate(selected_masks, 0)\n        selected_masks = torch.from_numpy(selected_masks).float()\n\n        return selected_masks\n"
  },
  {
    "path": "network/__init__.py",
    "content": "from . import *"
  },
  {
    "path": "network/backbone/__init__.py",
    "content": "from .resnet import resnet18, resnet34, resnet50, resnet101, deformable_resnet50, deformable_resnet18"
  },
  {
    "path": "network/backbone/assets/dcn/Makefile",
    "content": "#!/bin/bash\nrm  *.so \npython setup.py build_ext --inplace\nrm -rf ./build\n\n\n"
  },
  {
    "path": "network/backbone/assets/dcn/Makefile.sh",
    "content": "#!/bin/bash\nrm  *.so \npython setup.py build_ext --inplace\nrm -rf ./build\n\n\n"
  },
  {
    "path": "network/backbone/assets/dcn/__init__.py",
    "content": "from .functions.deform_conv import deform_conv, modulated_deform_conv\nfrom .functions.deform_pool import deform_roi_pooling\nfrom .modules.deform_conv import (DeformConv, ModulatedDeformConv,\n                                  DeformConvPack, ModulatedDeformConvPack)\nfrom .modules.deform_pool import (DeformRoIPooling, DeformRoIPoolingPack,\n                                  ModulatedDeformRoIPoolingPack)\n\n__all__ = [\n    'DeformConv', 'DeformConvPack', 'ModulatedDeformConv',\n    'ModulatedDeformConvPack', 'DeformRoIPooling', 'DeformRoIPoolingPack',\n    'ModulatedDeformRoIPoolingPack', 'deform_conv', 'modulated_deform_conv',\n    'deform_roi_pooling'\n]\n"
  },
  {
    "path": "network/backbone/assets/dcn/functions/__init__.py",
    "content": ""
  },
  {
    "path": "network/backbone/assets/dcn/functions/deform_conv.py",
    "content": "import torch\nfrom torch.autograd import Function\nfrom torch.nn.modules.utils import _pair\n\nfrom .. import deform_conv_cuda\n\n\nclass DeformConvFunction(Function):\n\n    @staticmethod\n    def forward(ctx,\n                input,\n                offset,\n                weight,\n                stride=1,\n                padding=0,\n                dilation=1,\n                groups=1,\n                deformable_groups=1,\n                im2col_step=64):\n        if input is not None and input.dim() != 4:\n            raise ValueError(\n                \"Expected 4D tensor as input, got {}D tensor instead.\".format(\n                    input.dim()))\n        ctx.stride = _pair(stride)\n        ctx.padding = _pair(padding)\n        ctx.dilation = _pair(dilation)\n        ctx.groups = groups\n        ctx.deformable_groups = deformable_groups\n        ctx.im2col_step = im2col_step\n\n        ctx.save_for_backward(input, offset, weight)\n\n        output = input.new_empty(\n            DeformConvFunction._output_size(input, weight, ctx.padding,\n                                            ctx.dilation, ctx.stride))\n\n        ctx.bufs_ = [input.new_empty(0), input.new_empty(0)]  # columns, ones\n\n        if not input.is_cuda:\n            raise NotImplementedError\n        else:\n            cur_im2col_step = min(ctx.im2col_step, input.shape[0])\n            assert (input.shape[0] %\n                    cur_im2col_step) == 0, 'im2col step must divide batchsize'\n            deform_conv_cuda.deform_conv_forward_cuda(\n                input, weight, offset, output, ctx.bufs_[0], ctx.bufs_[1],\n                weight.size(3), weight.size(2), ctx.stride[1], ctx.stride[0],\n                ctx.padding[1], ctx.padding[0], ctx.dilation[1],\n                ctx.dilation[0], ctx.groups, ctx.deformable_groups,\n                cur_im2col_step)\n        return output\n\n    @staticmethod\n    def backward(ctx, grad_output):\n        input, offset, 
weight = ctx.saved_tensors\n\n        grad_input = grad_offset = grad_weight = None\n\n        if not grad_output.is_cuda:\n            raise NotImplementedError\n        else:\n            cur_im2col_step = min(ctx.im2col_step, input.shape[0])\n            assert (input.shape[0] %\n                    cur_im2col_step) == 0, 'im2col step must divide batchsize'\n\n            if ctx.needs_input_grad[0] or ctx.needs_input_grad[1]:\n                grad_input = torch.zeros_like(input)\n                grad_offset = torch.zeros_like(offset)\n                deform_conv_cuda.deform_conv_backward_input_cuda(\n                    input, offset, grad_output, grad_input,\n                    grad_offset, weight, ctx.bufs_[0], weight.size(3),\n                    weight.size(2), ctx.stride[1], ctx.stride[0],\n                    ctx.padding[1], ctx.padding[0], ctx.dilation[1],\n                    ctx.dilation[0], ctx.groups, ctx.deformable_groups,\n                    cur_im2col_step)\n\n            if ctx.needs_input_grad[2]:\n                grad_weight = torch.zeros_like(weight)\n                deform_conv_cuda.deform_conv_backward_parameters_cuda(\n                    input, offset, grad_output,\n                    grad_weight, ctx.bufs_[0], ctx.bufs_[1], weight.size(3),\n                    weight.size(2), ctx.stride[1], ctx.stride[0],\n                    ctx.padding[1], ctx.padding[0], ctx.dilation[1],\n                    ctx.dilation[0], ctx.groups, ctx.deformable_groups, 1,\n                    cur_im2col_step)\n\n        return (grad_input, grad_offset, grad_weight, None, None, None, None,\n                None)\n\n    @staticmethod\n    def _output_size(input, weight, padding, dilation, stride):\n        channels = weight.size(0)\n        output_size = (input.size(0), channels)\n        for d in range(input.dim() - 2):\n            in_size = input.size(d + 2)\n            pad = padding[d]\n            kernel = dilation[d] * (weight.size(d + 2) - 1) + 1\n       
     stride_ = stride[d]\n            output_size += ((in_size + (2 * pad) - kernel) // stride_ + 1, )\n        if not all(map(lambda s: s > 0, output_size)):\n            raise ValueError(\n                \"convolution input is too small (output would be {})\".format(\n                    'x'.join(map(str, output_size))))\n        return output_size\n\n\nclass ModulatedDeformConvFunction(Function):\n\n    @staticmethod\n    def forward(ctx,\n                input,\n                offset,\n                mask,\n                weight,\n                bias=None,\n                stride=1,\n                padding=0,\n                dilation=1,\n                groups=1,\n                deformable_groups=1):\n        ctx.stride = stride\n        ctx.padding = padding\n        ctx.dilation = dilation\n        ctx.groups = groups\n        ctx.deformable_groups = deformable_groups\n        ctx.with_bias = bias is not None\n        if not ctx.with_bias:\n            bias = input.new_empty(1)  # fake tensor\n        if not input.is_cuda:\n            raise NotImplementedError\n        if weight.requires_grad or mask.requires_grad or offset.requires_grad \\\n                or input.requires_grad:\n            ctx.save_for_backward(input, offset, mask, weight, bias)\n        output = input.new_empty(\n            ModulatedDeformConvFunction._infer_shape(ctx, input, weight))\n        ctx._bufs = [input.new_empty(0), input.new_empty(0)]\n        deform_conv_cuda.modulated_deform_conv_cuda_forward(\n            input, weight, bias, ctx._bufs[0], offset, mask, output,\n            ctx._bufs[1], weight.shape[2], weight.shape[3], ctx.stride,\n            ctx.stride, ctx.padding, ctx.padding, ctx.dilation, ctx.dilation,\n            ctx.groups, ctx.deformable_groups, ctx.with_bias)\n        return output\n\n    @staticmethod\n    def backward(ctx, grad_output):\n        if not grad_output.is_cuda:\n            raise NotImplementedError\n        input, offset, mask, weight, 
bias = ctx.saved_tensors\n        grad_input = torch.zeros_like(input)\n        grad_offset = torch.zeros_like(offset)\n        grad_mask = torch.zeros_like(mask)\n        grad_weight = torch.zeros_like(weight)\n        grad_bias = torch.zeros_like(bias)\n        deform_conv_cuda.modulated_deform_conv_cuda_backward(\n            input, weight, bias, ctx._bufs[0], offset, mask, ctx._bufs[1],\n            grad_input, grad_weight, grad_bias, grad_offset, grad_mask,\n            grad_output, weight.shape[2], weight.shape[3], ctx.stride,\n            ctx.stride, ctx.padding, ctx.padding, ctx.dilation, ctx.dilation,\n            ctx.groups, ctx.deformable_groups, ctx.with_bias)\n        if not ctx.with_bias:\n            grad_bias = None\n\n        return (grad_input, grad_offset, grad_mask, grad_weight, grad_bias,\n                None, None, None, None, None)\n\n    @staticmethod\n    def _infer_shape(ctx, input, weight):\n        n = input.size(0)\n        channels_out = weight.size(0)\n        height, width = input.shape[2:4]\n        kernel_h, kernel_w = weight.shape[2:4]\n        height_out = (height + 2 * ctx.padding -\n                      (ctx.dilation * (kernel_h - 1) + 1)) // ctx.stride + 1\n        width_out = (width + 2 * ctx.padding -\n                     (ctx.dilation * (kernel_w - 1) + 1)) // ctx.stride + 1\n        return n, channels_out, height_out, width_out\n\n\ndeform_conv = DeformConvFunction.apply\nmodulated_deform_conv = ModulatedDeformConvFunction.apply\n"
  },
  {
    "path": "network/backbone/assets/dcn/functions/deform_pool.py",
    "content": "import torch\nfrom torch.autograd import Function\n\nfrom .. import deform_pool_cuda\n\n\nclass DeformRoIPoolingFunction(Function):\n\n    @staticmethod\n    def forward(ctx,\n                data,\n                rois,\n                offset,\n                spatial_scale,\n                out_size,\n                out_channels,\n                no_trans,\n                group_size=1,\n                part_size=None,\n                sample_per_part=4,\n                trans_std=.0):\n        ctx.spatial_scale = spatial_scale\n        ctx.out_size = out_size\n        ctx.out_channels = out_channels\n        ctx.no_trans = no_trans\n        ctx.group_size = group_size\n        ctx.part_size = out_size if part_size is None else part_size\n        ctx.sample_per_part = sample_per_part\n        ctx.trans_std = trans_std\n\n        assert 0.0 <= ctx.trans_std <= 1.0\n        if not data.is_cuda:\n            raise NotImplementedError\n\n        n = rois.shape[0]\n        output = data.new_empty(n, out_channels, out_size, out_size)\n        output_count = data.new_empty(n, out_channels, out_size, out_size)\n        deform_pool_cuda.deform_psroi_pooling_cuda_forward(\n            data, rois, offset, output, output_count, ctx.no_trans,\n            ctx.spatial_scale, ctx.out_channels, ctx.group_size, ctx.out_size,\n            ctx.part_size, ctx.sample_per_part, ctx.trans_std)\n\n        if data.requires_grad or rois.requires_grad or offset.requires_grad:\n            ctx.save_for_backward(data, rois, offset)\n        ctx.output_count = output_count\n\n        return output\n\n    @staticmethod\n    def backward(ctx, grad_output):\n        if not grad_output.is_cuda:\n            raise NotImplementedError\n\n        data, rois, offset = ctx.saved_tensors\n        output_count = ctx.output_count\n        grad_input = torch.zeros_like(data)\n        grad_rois = None\n        grad_offset = torch.zeros_like(offset)\n\n        
deform_pool_cuda.deform_psroi_pooling_cuda_backward(\n            grad_output, data, rois, offset, output_count, grad_input,\n            grad_offset, ctx.no_trans, ctx.spatial_scale, ctx.out_channels,\n            ctx.group_size, ctx.out_size, ctx.part_size, ctx.sample_per_part,\n            ctx.trans_std)\n        return (grad_input, grad_rois, grad_offset, None, None, None, None,\n                None, None, None, None)\n\n\ndeform_roi_pooling = DeformRoIPoolingFunction.apply\n"
  },
  {
    "path": "network/backbone/assets/dcn/modules/__init__.py",
    "content": ""
  },
  {
    "path": "network/backbone/assets/dcn/modules/deform_conv.py",
    "content": "import math\n\nimport torch\nimport torch.nn as nn\nfrom torch.nn.modules.utils import _pair\n\nfrom ..functions.deform_conv import deform_conv, modulated_deform_conv\n\n\nclass DeformConv(nn.Module):\n\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 deformable_groups=1,\n                 bias=False):\n        super(DeformConv, self).__init__()\n\n        assert not bias\n        assert in_channels % groups == 0, \\\n            'in_channels {} cannot be divisible by groups {}'.format(\n                in_channels, groups)\n        assert out_channels % groups == 0, \\\n            'out_channels {} cannot be divisible by groups {}'.format(\n                out_channels, groups)\n\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = _pair(kernel_size)\n        self.stride = _pair(stride)\n        self.padding = _pair(padding)\n        self.dilation = _pair(dilation)\n        self.groups = groups\n        self.deformable_groups = deformable_groups\n\n        self.weight = nn.Parameter(\n            torch.Tensor(out_channels, in_channels // self.groups,\n                         *self.kernel_size))\n\n        self.reset_parameters()\n\n    def reset_parameters(self):\n        n = self.in_channels\n        for k in self.kernel_size:\n            n *= k\n        stdv = 1. 
/ math.sqrt(n)\n        self.weight.data.uniform_(-stdv, stdv)\n\n    def forward(self, x, offset):\n        return deform_conv(x, offset, self.weight, self.stride, self.padding,\n                           self.dilation, self.groups, self.deformable_groups)\n\n\nclass DeformConvPack(DeformConv):\n\n    def __init__(self, *args, **kwargs):\n        super(DeformConvPack, self).__init__(*args, **kwargs)\n\n        self.conv_offset = nn.Conv2d(\n            self.in_channels,\n            self.deformable_groups * 2 * self.kernel_size[0] *\n            self.kernel_size[1],\n            kernel_size=self.kernel_size,\n            stride=_pair(self.stride),\n            padding=_pair(self.padding),\n            bias=True)\n        self.init_offset()\n\n    def init_offset(self):\n        self.conv_offset.weight.data.zero_()\n        self.conv_offset.bias.data.zero_()\n\n    def forward(self, x):\n        offset = self.conv_offset(x)\n        return deform_conv(x, offset, self.weight, self.stride, self.padding,\n                           self.dilation, self.groups, self.deformable_groups)\n\n\nclass ModulatedDeformConv(nn.Module):\n\n    def __init__(self,\n                 in_channels,\n                 out_channels,\n                 kernel_size,\n                 stride=1,\n                 padding=0,\n                 dilation=1,\n                 groups=1,\n                 deformable_groups=1,\n                 bias=True):\n        super(ModulatedDeformConv, self).__init__()\n        self.in_channels = in_channels\n        self.out_channels = out_channels\n        self.kernel_size = _pair(kernel_size)\n        self.stride = stride\n        self.padding = padding\n        self.dilation = dilation\n        self.groups = groups\n        self.deformable_groups = deformable_groups\n        self.with_bias = bias\n\n        self.weight = nn.Parameter(\n            torch.Tensor(out_channels, in_channels // groups,\n                         *self.kernel_size))\n        if 
bias:\n            self.bias = nn.Parameter(torch.Tensor(out_channels))\n        else:\n            self.register_parameter('bias', None)\n        self.reset_parameters()\n\n    def reset_parameters(self):\n        n = self.in_channels\n        for k in self.kernel_size:\n            n *= k\n        stdv = 1. / math.sqrt(n)\n        self.weight.data.uniform_(-stdv, stdv)\n        if self.bias is not None:\n            self.bias.data.zero_()\n\n    def forward(self, x, offset, mask):\n        return modulated_deform_conv(x, offset, mask, self.weight, self.bias,\n                                     self.stride, self.padding, self.dilation,\n                                     self.groups, self.deformable_groups)\n\n\nclass ModulatedDeformConvPack(ModulatedDeformConv):\n\n    def __init__(self, *args, **kwargs):\n        super(ModulatedDeformConvPack, self).__init__(*args, **kwargs)\n\n        self.conv_offset_mask = nn.Conv2d(\n            self.in_channels,\n            self.deformable_groups * 3 * self.kernel_size[0] *\n            self.kernel_size[1],\n            kernel_size=self.kernel_size,\n            stride=_pair(self.stride),\n            padding=_pair(self.padding),\n            bias=True)\n        self.init_offset()\n\n    def init_offset(self):\n        self.conv_offset_mask.weight.data.zero_()\n        self.conv_offset_mask.bias.data.zero_()\n\n    def forward(self, x):\n        out = self.conv_offset_mask(x)\n        o1, o2, mask = torch.chunk(out, 3, dim=1)\n        offset = torch.cat((o1, o2), dim=1)\n        mask = torch.sigmoid(mask)\n        return modulated_deform_conv(x, offset, mask, self.weight, self.bias,\n                                     self.stride, self.padding, self.dilation,\n                                     self.groups, self.deformable_groups)\n"
  },
  {
    "path": "network/backbone/assets/dcn/modules/deform_pool.py",
    "content": "from torch import nn\n\nfrom ..functions.deform_pool import deform_roi_pooling\n\n\nclass DeformRoIPooling(nn.Module):\n\n    def __init__(self,\n                 spatial_scale,\n                 out_size,\n                 out_channels,\n                 no_trans,\n                 group_size=1,\n                 part_size=None,\n                 sample_per_part=4,\n                 trans_std=.0):\n        super(DeformRoIPooling, self).__init__()\n        self.spatial_scale = spatial_scale\n        self.out_size = out_size\n        self.out_channels = out_channels\n        self.no_trans = no_trans\n        self.group_size = group_size\n        self.part_size = out_size if part_size is None else part_size\n        self.sample_per_part = sample_per_part\n        self.trans_std = trans_std\n\n    def forward(self, data, rois, offset):\n        if self.no_trans:\n            offset = data.new_empty(0)\n        return deform_roi_pooling(\n            data, rois, offset, self.spatial_scale, self.out_size,\n            self.out_channels, self.no_trans, self.group_size, self.part_size,\n            self.sample_per_part, self.trans_std)\n\n\nclass DeformRoIPoolingPack(DeformRoIPooling):\n\n    def __init__(self,\n                 spatial_scale,\n                 out_size,\n                 out_channels,\n                 no_trans,\n                 group_size=1,\n                 part_size=None,\n                 sample_per_part=4,\n                 trans_std=.0,\n                 num_offset_fcs=3,\n                 deform_fc_channels=1024):\n        super(DeformRoIPoolingPack,\n              self).__init__(spatial_scale, out_size, out_channels, no_trans,\n                             group_size, part_size, sample_per_part, trans_std)\n\n        self.num_offset_fcs = num_offset_fcs\n        self.deform_fc_channels = deform_fc_channels\n\n        if not no_trans:\n            seq = []\n            ic = self.out_size * self.out_size * self.out_channels\n      
      for i in range(self.num_offset_fcs):\n                if i < self.num_offset_fcs - 1:\n                    oc = self.deform_fc_channels\n                else:\n                    oc = self.out_size * self.out_size * 2\n                seq.append(nn.Linear(ic, oc))\n                ic = oc\n                if i < self.num_offset_fcs - 1:\n                    seq.append(nn.ReLU(inplace=True))\n            self.offset_fc = nn.Sequential(*seq)\n            self.offset_fc[-1].weight.data.zero_()\n            self.offset_fc[-1].bias.data.zero_()\n\n    def forward(self, data, rois):\n        assert data.size(1) == self.out_channels\n        if self.no_trans:\n            offset = data.new_empty(0)\n            return deform_roi_pooling(\n                data, rois, offset, self.spatial_scale, self.out_size,\n                self.out_channels, self.no_trans, self.group_size,\n                self.part_size, self.sample_per_part, self.trans_std)\n        else:\n            n = rois.shape[0]\n            offset = data.new_empty(0)\n            x = deform_roi_pooling(data, rois, offset, self.spatial_scale,\n                                   self.out_size, self.out_channels, True,\n                                   self.group_size, self.part_size,\n                                   self.sample_per_part, self.trans_std)\n            offset = self.offset_fc(x.view(n, -1))\n            offset = offset.view(n, 2, self.out_size, self.out_size)\n            return deform_roi_pooling(\n                data, rois, offset, self.spatial_scale, self.out_size,\n                self.out_channels, self.no_trans, self.group_size,\n                self.part_size, self.sample_per_part, self.trans_std)\n\n\nclass ModulatedDeformRoIPoolingPack(DeformRoIPooling):\n\n    def __init__(self,\n                 spatial_scale,\n                 out_size,\n                 out_channels,\n                 no_trans,\n                 group_size=1,\n                 part_size=None,\n             
    sample_per_part=4,\n                 trans_std=.0,\n                 num_offset_fcs=3,\n                 num_mask_fcs=2,\n                 deform_fc_channels=1024):\n        super(ModulatedDeformRoIPoolingPack, self).__init__(\n            spatial_scale, out_size, out_channels, no_trans, group_size,\n            part_size, sample_per_part, trans_std)\n\n        self.num_offset_fcs = num_offset_fcs\n        self.num_mask_fcs = num_mask_fcs\n        self.deform_fc_channels = deform_fc_channels\n\n        if not no_trans:\n            offset_fc_seq = []\n            ic = self.out_size * self.out_size * self.out_channels\n            for i in range(self.num_offset_fcs):\n                if i < self.num_offset_fcs - 1:\n                    oc = self.deform_fc_channels\n                else:\n                    oc = self.out_size * self.out_size * 2\n                offset_fc_seq.append(nn.Linear(ic, oc))\n                ic = oc\n                if i < self.num_offset_fcs - 1:\n                    offset_fc_seq.append(nn.ReLU(inplace=True))\n            self.offset_fc = nn.Sequential(*offset_fc_seq)\n            self.offset_fc[-1].weight.data.zero_()\n            self.offset_fc[-1].bias.data.zero_()\n\n            mask_fc_seq = []\n            ic = self.out_size * self.out_size * self.out_channels\n            for i in range(self.num_mask_fcs):\n                if i < self.num_mask_fcs - 1:\n                    oc = self.deform_fc_channels\n                else:\n                    oc = self.out_size * self.out_size\n                mask_fc_seq.append(nn.Linear(ic, oc))\n                ic = oc\n                if i < self.num_mask_fcs - 1:\n                    mask_fc_seq.append(nn.ReLU(inplace=True))\n                else:\n                    mask_fc_seq.append(nn.Sigmoid())\n            self.mask_fc = nn.Sequential(*mask_fc_seq)\n            self.mask_fc[-2].weight.data.zero_()\n            self.mask_fc[-2].bias.data.zero_()\n\n    def forward(self, data, 
rois):\n        assert data.size(1) == self.out_channels\n        if self.no_trans:\n            offset = data.new_empty(0)\n            return deform_roi_pooling(\n                data, rois, offset, self.spatial_scale, self.out_size,\n                self.out_channels, self.no_trans, self.group_size,\n                self.part_size, self.sample_per_part, self.trans_std)\n        else:\n            n = rois.shape[0]\n            offset = data.new_empty(0)\n            x = deform_roi_pooling(data, rois, offset, self.spatial_scale,\n                                   self.out_size, self.out_channels, True,\n                                   self.group_size, self.part_size,\n                                   self.sample_per_part, self.trans_std)\n            offset = self.offset_fc(x.view(n, -1))\n            offset = offset.view(n, 2, self.out_size, self.out_size)\n            mask = self.mask_fc(x.view(n, -1))\n            mask = mask.view(n, 1, self.out_size, self.out_size)\n            return deform_roi_pooling(\n                data, rois, offset, self.spatial_scale, self.out_size,\n                self.out_channels, self.no_trans, self.group_size,\n                self.part_size, self.sample_per_part, self.trans_std) * mask\n"
  },
  {
    "path": "network/backbone/assets/dcn/setup.py",
    "content": "import os\nPATH =\"{}:{}\".format(os.environ['PATH'], \"/opt/cuda/bin\")\n# os.environ['CUDA_VISIBLE_DEVICES'] = \"1\"\nos.environ['PATH'] = PATH\nfrom setuptools import setup\nfrom torch.utils.cpp_extension import BuildExtension, CUDAExtension\n\nsetup(\n    name='deform_conv',\n    ext_modules=[\n        CUDAExtension('deform_conv_cuda', [\n            'src/deform_conv_cuda.cpp',\n            'src/deform_conv_cuda_kernel.cu',\n        ]),\n        CUDAExtension('deform_pool_cuda', [\n            'src/deform_pool_cuda.cpp', 'src/deform_pool_cuda_kernel.cu'\n        ]),\n    ],\n    cmdclass={'build_ext': BuildExtension})\n"
  },
  {
    "path": "network/backbone/assets/dcn/src/deform_conv_cuda.cpp",
    "content": "// modify from\n// https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c\n\n#include <torch/extension.h>\n\n#include <cmath>\n#include <vector>\n\nvoid deformable_im2col(const at::Tensor data_im, const at::Tensor data_offset,\n                       const int channels, const int height, const int width,\n                       const int ksize_h, const int ksize_w, const int pad_h,\n                       const int pad_w, const int stride_h, const int stride_w,\n                       const int dilation_h, const int dilation_w,\n                       const int parallel_imgs, const int deformable_group,\n                       at::Tensor data_col);\n\nvoid deformable_col2im(const at::Tensor data_col, const at::Tensor data_offset,\n                       const int channels, const int height, const int width,\n                       const int ksize_h, const int ksize_w, const int pad_h,\n                       const int pad_w, const int stride_h, const int stride_w,\n                       const int dilation_h, const int dilation_w,\n                       const int parallel_imgs, const int deformable_group,\n                       at::Tensor grad_im);\n\nvoid deformable_col2im_coord(\n    const at::Tensor data_col, const at::Tensor data_im,\n    const at::Tensor data_offset, const int channels, const int height,\n    const int width, const int ksize_h, const int ksize_w, const int pad_h,\n    const int pad_w, const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w, const int parallel_imgs,\n    const int deformable_group, at::Tensor grad_offset);\n\nvoid modulated_deformable_im2col_cuda(\n    const at::Tensor data_im, const at::Tensor data_offset,\n    const at::Tensor data_mask, const int batch_size, const int channels,\n    const int height_im, const int width_im, const int height_col,\n    const int width_col, const int kernel_h, const int kenerl_w,\n    
const int pad_h, const int pad_w, const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w, const int deformable_group,\n    at::Tensor data_col);\n\nvoid modulated_deformable_col2im_cuda(\n    const at::Tensor data_col, const at::Tensor data_offset,\n    const at::Tensor data_mask, const int batch_size, const int channels,\n    const int height_im, const int width_im, const int height_col,\n    const int width_col, const int kernel_h, const int kernel_w,\n    const int pad_h, const int pad_w, const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w, const int deformable_group,\n    at::Tensor grad_im);\n\nvoid modulated_deformable_col2im_coord_cuda(\n    const at::Tensor data_col, const at::Tensor data_im,\n    const at::Tensor data_offset, const at::Tensor data_mask,\n    const int batch_size, const int channels, const int height_im,\n    const int width_im, const int height_col, const int width_col,\n    const int kernel_h, const int kernel_w, const int pad_h, const int pad_w,\n    const int stride_h, const int stride_w, const int dilation_h,\n    const int dilation_w, const int deformable_group, at::Tensor grad_offset,\n    at::Tensor grad_mask);\n\nvoid shape_check(at::Tensor input, at::Tensor offset, at::Tensor *gradOutput,\n                 at::Tensor weight, int kH, int kW, int dH, int dW, int padH,\n                 int padW, int dilationH, int dilationW, int group,\n                 int deformable_group) {\n  TORCH_CHECK(weight.ndimension() == 4,\n           \"4D weight tensor (nOutputPlane,nInputPlane,kH,kW) expected, \"\n           \"but got: %s\",\n           weight.ndimension());\n\n  TORCH_CHECK(weight.is_contiguous(), \"weight tensor has to be contiguous\");\n\n  TORCH_CHECK(kW > 0 && kH > 0,\n           \"kernel size should be greater than zero, but got kH: %d kW: %d\", kH,\n           kW);\n\n  TORCH_CHECK((weight.size(2) == kH && weight.size(3) == kW),\n           \"kernel size should 
be consistent with weight, \",\n           \"but got kH: %d kW: %d weight.size(2): %d, weight.size(3): %d\", kH,\n           kW, weight.size(2), weight.size(3));\n\n  TORCH_CHECK(dW > 0 && dH > 0,\n           \"stride should be greater than zero, but got dH: %d dW: %d\", dH, dW);\n\n  TORCH_CHECK(\n      dilationW > 0 && dilationH > 0,\n      \"dilation should be greater than 0, but got dilationH: %d dilationW: %d\",\n      dilationH, dilationW);\n\n  int ndim = input.ndimension();\n  int dimf = 0;\n  int dimh = 1;\n  int dimw = 2;\n\n  if (ndim == 4) {\n    dimf++;\n    dimh++;\n    dimw++;\n  }\n\n  TORCH_CHECK(ndim == 3 || ndim == 4, \"3D or 4D input tensor expected but got: %s\",\n           ndim);\n\n  long nInputPlane = weight.size(1) * group;\n  long inputHeight = input.size(dimh);\n  long inputWidth = input.size(dimw);\n  long nOutputPlane = weight.size(0);\n  long outputHeight =\n      (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1;\n  long outputWidth =\n      (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1;\n\n  TORCH_CHECK(nInputPlane % deformable_group == 0,\n           \"input channels must divide deformable group size\");\n\n  TORCH_CHECK(outputWidth >= 1 && outputHeight >= 1,\n              \"Given input size: (\", nInputPlane, \" x \", inputHeight, \" x \",\n              inputWidth, \"). Calculated output size: (\", nOutputPlane,\n              \" x \", outputHeight, \" x \", outputWidth,\n              \"). 
Output size is too small\");\n\n  TORCH_CHECK(input.size(1) == nInputPlane,\n           \"invalid number of input planes, expected: %d, but got: %d\",\n           nInputPlane, input.size(1));\n\n  TORCH_CHECK((inputHeight >= kH && inputWidth >= kW),\n           \"input image is smaller than kernel\");\n\n  TORCH_CHECK((offset.size(2) == outputHeight && offset.size(3) == outputWidth),\n           \"invalid spatial size of offset, expected height: %d width: %d, but \"\n           \"got height: %d width: %d\",\n           outputHeight, outputWidth, offset.size(2), offset.size(3));\n\n  TORCH_CHECK((offset.size(1) == deformable_group * 2 * kH * kW),\n           \"invalid number of channels of offset\");\n\n  if (gradOutput != nullptr) {\n    TORCH_CHECK(gradOutput->size(dimf) == nOutputPlane,\n             \"invalid number of gradOutput planes, expected: %d, but got: %d\",\n             nOutputPlane, gradOutput->size(dimf));\n\n    TORCH_CHECK((gradOutput->size(dimh) == outputHeight &&\n              gradOutput->size(dimw) == outputWidth),\n             \"invalid size of gradOutput, expected height: %d width: %d , but \"\n             \"got height: %d width: %d\",\n             outputHeight, outputWidth, gradOutput->size(dimh),\n             gradOutput->size(dimw));\n  }\n}\n\nint deform_conv_forward_cuda(at::Tensor input, at::Tensor weight,\n                             at::Tensor offset, at::Tensor output,\n                             at::Tensor columns, at::Tensor ones, int kW,\n                             int kH, int dW, int dH, int padW, int padH,\n                             int dilationW, int dilationH, int group,\n                             int deformable_group, int im2col_step) {\n  // todo: resize columns to include im2col: done\n  // todo: add im2col_step as input\n  // todo: add new output buffer and transpose it to output (or directly\n  // transpose output) todo: possibly change data indexing because of\n  // parallel_imgs\n\n  shape_check(input, 
offset, nullptr, weight, kH, kW, dH, dW, padH, padW,\n              dilationH, dilationW, group, deformable_group);\n\n  input = input.contiguous();\n  offset = offset.contiguous();\n  weight = weight.contiguous();\n\n  int batch = 1;\n  if (input.ndimension() == 3) {\n    // Force batch\n    batch = 0;\n    input.unsqueeze_(0);\n    offset.unsqueeze_(0);\n  }\n\n  // todo: assert batchsize dividable by im2col_step\n\n  long batchSize = input.size(0);\n  long nInputPlane = input.size(1);\n  long inputHeight = input.size(2);\n  long inputWidth = input.size(3);\n\n  long nOutputPlane = weight.size(0);\n\n  long outputWidth =\n      (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1;\n  long outputHeight =\n      (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1;\n\n  TORCH_CHECK(offset.size(0) == batchSize, \"invalid batch size of offset\");\n\n  output = output.view({batchSize / im2col_step, im2col_step, nOutputPlane,\n                        outputHeight, outputWidth});\n  columns = at::zeros(\n      {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth},\n      input.options());\n\n  if (ones.ndimension() != 2 ||\n      ones.size(0) * ones.size(1) < outputHeight * outputWidth) {\n    ones = at::ones({outputHeight, outputWidth}, input.options());\n  }\n\n  input = input.view({batchSize / im2col_step, im2col_step, nInputPlane,\n                      inputHeight, inputWidth});\n  offset =\n      offset.view({batchSize / im2col_step, im2col_step,\n                   deformable_group * 2 * kH * kW, outputHeight, outputWidth});\n\n  at::Tensor output_buffer =\n      at::zeros({batchSize / im2col_step, nOutputPlane,\n                 im2col_step * outputHeight, outputWidth},\n                output.options());\n\n  output_buffer = output_buffer.view(\n      {output_buffer.size(0), group, output_buffer.size(1) / group,\n       output_buffer.size(2), output_buffer.size(3)});\n\n  for (int elt = 0; elt < batchSize / im2col_step; elt++) {\n 
   deformable_im2col(input[elt], offset[elt], nInputPlane, inputHeight,\n                      inputWidth, kH, kW, padH, padW, dH, dW, dilationH,\n                      dilationW, im2col_step, deformable_group, columns);\n\n    columns = columns.view({group, columns.size(0) / group, columns.size(1)});\n    weight = weight.view({group, weight.size(0) / group, weight.size(1),\n                          weight.size(2), weight.size(3)});\n\n    for (int g = 0; g < group; g++) {\n      output_buffer[elt][g] = output_buffer[elt][g]\n                                  .flatten(1)\n                                  .addmm_(weight[g].flatten(1), columns[g])\n                                  .view_as(output_buffer[elt][g]);\n    }\n  }\n\n  output_buffer = output_buffer.view(\n      {output_buffer.size(0), output_buffer.size(1) * output_buffer.size(2),\n       output_buffer.size(3), output_buffer.size(4)});\n\n  output_buffer = output_buffer.view({batchSize / im2col_step, nOutputPlane,\n                                      im2col_step, outputHeight, outputWidth});\n  output_buffer.transpose_(1, 2);\n  output.copy_(output_buffer);\n  output = output.view({batchSize, nOutputPlane, outputHeight, outputWidth});\n\n  input = input.view({batchSize, nInputPlane, inputHeight, inputWidth});\n  offset = offset.view(\n      {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth});\n\n  if (batch == 0) {\n    output = output.view({nOutputPlane, outputHeight, outputWidth});\n    input = input.view({nInputPlane, inputHeight, inputWidth});\n    offset = offset.view({offset.size(1), offset.size(2), offset.size(3)});\n  }\n\n  return 1;\n}\n\nint deform_conv_backward_input_cuda(at::Tensor input, at::Tensor offset,\n                                    at::Tensor gradOutput, at::Tensor gradInput,\n                                    at::Tensor gradOffset, at::Tensor weight,\n                                    at::Tensor columns, int kW, int kH, int dW,\n                        
            int dH, int padW, int padH, int dilationW,\n                                    int dilationH, int group,\n                                    int deformable_group, int im2col_step) {\n  shape_check(input, offset, &gradOutput, weight, kH, kW, dH, dW, padH, padW,\n              dilationH, dilationW, group, deformable_group);\n\n  input = input.contiguous();\n  offset = offset.contiguous();\n  gradOutput = gradOutput.contiguous();\n  weight = weight.contiguous();\n\n  int batch = 1;\n\n  if (input.ndimension() == 3) {\n    // Force batch\n    batch = 0;\n    input = input.view({1, input.size(0), input.size(1), input.size(2)});\n    offset = offset.view({1, offset.size(0), offset.size(1), offset.size(2)});\n    gradOutput = gradOutput.view(\n        {1, gradOutput.size(0), gradOutput.size(1), gradOutput.size(2)});\n  }\n\n  long batchSize = input.size(0);\n  long nInputPlane = input.size(1);\n  long inputHeight = input.size(2);\n  long inputWidth = input.size(3);\n\n  long nOutputPlane = weight.size(0);\n\n  long outputWidth =\n      (inputWidth + 2 * padW - (dilationW * (kW - 1) + 1)) / dW + 1;\n  long outputHeight =\n      (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1;\n\n  TORCH_CHECK(offset.size(0) == batchSize, \"invalid batch size of offset\");\n  gradInput = gradInput.view({batchSize, nInputPlane, inputHeight, inputWidth});\n  columns = at::zeros(\n      {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth},\n      input.options());\n\n  // change order of grad output\n  gradOutput = gradOutput.view({batchSize / im2col_step, im2col_step,\n                                nOutputPlane, outputHeight, outputWidth});\n  gradOutput.transpose_(1, 2);\n\n  gradInput = gradInput.view({batchSize / im2col_step, im2col_step, nInputPlane,\n                              inputHeight, inputWidth});\n  input = input.view({batchSize / im2col_step, im2col_step, nInputPlane,\n                      inputHeight, inputWidth});\n  gradOffset = 
gradOffset.view({batchSize / im2col_step, im2col_step,\n                                deformable_group * 2 * kH * kW, outputHeight,\n                                outputWidth});\n  offset =\n      offset.view({batchSize / im2col_step, im2col_step,\n                   deformable_group * 2 * kH * kW, outputHeight, outputWidth});\n\n  for (int elt = 0; elt < batchSize / im2col_step; elt++) {\n    // divide into groups\n    columns = columns.view({group, columns.size(0) / group, columns.size(1)});\n    weight = weight.view({group, weight.size(0) / group, weight.size(1),\n                          weight.size(2), weight.size(3)});\n    gradOutput = gradOutput.view(\n        {gradOutput.size(0), group, gradOutput.size(1) / group,\n         gradOutput.size(2), gradOutput.size(3), gradOutput.size(4)});\n\n    for (int g = 0; g < group; g++) {\n      columns[g] = columns[g].addmm_(weight[g].flatten(1).transpose(0, 1),\n                                     gradOutput[elt][g].flatten(1), 0.0f, 1.0f);\n    }\n\n    columns =\n        columns.view({columns.size(0) * columns.size(1), columns.size(2)});\n    gradOutput = gradOutput.view(\n        {gradOutput.size(0), gradOutput.size(1) * gradOutput.size(2),\n         gradOutput.size(3), gradOutput.size(4), gradOutput.size(5)});\n\n    deformable_col2im_coord(columns, input[elt], offset[elt], nInputPlane,\n                            inputHeight, inputWidth, kH, kW, padH, padW, dH, dW,\n                            dilationH, dilationW, im2col_step, deformable_group,\n                            gradOffset[elt]);\n\n    deformable_col2im(columns, offset[elt], nInputPlane, inputHeight,\n                      inputWidth, kH, kW, padH, padW, dH, dW, dilationH,\n                      dilationW, im2col_step, deformable_group, gradInput[elt]);\n  }\n\n  gradOutput.transpose_(1, 2);\n  gradOutput =\n      gradOutput.view({batchSize, nOutputPlane, outputHeight, outputWidth});\n\n  gradInput = gradInput.view({batchSize, nInputPlane, 
inputHeight, inputWidth});\n  input = input.view({batchSize, nInputPlane, inputHeight, inputWidth});\n  gradOffset = gradOffset.view(\n      {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth});\n  offset = offset.view(\n      {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth});\n\n  if (batch == 0) {\n    gradOutput = gradOutput.view({nOutputPlane, outputHeight, outputWidth});\n    input = input.view({nInputPlane, inputHeight, inputWidth});\n    gradInput = gradInput.view({nInputPlane, inputHeight, inputWidth});\n    offset = offset.view({offset.size(1), offset.size(2), offset.size(3)});\n    gradOffset =\n        gradOffset.view({offset.size(1), offset.size(2), offset.size(3)});\n  }\n\n  return 1;\n}\n\nint deform_conv_backward_parameters_cuda(\n    at::Tensor input, at::Tensor offset, at::Tensor gradOutput,\n    at::Tensor gradWeight,  // at::Tensor gradBias,\n    at::Tensor columns, at::Tensor ones, int kW, int kH, int dW, int dH,\n    int padW, int padH, int dilationW, int dilationH, int group,\n    int deformable_group, float scale, int im2col_step) {\n  // todo: transpose and reshape outGrad\n  // todo: reshape columns\n  // todo: add im2col_step as input\n\n  shape_check(input, offset, &gradOutput, gradWeight, kH, kW, dH, dW, padH,\n              padW, dilationH, dilationW, group, deformable_group);\n\n  input = input.contiguous();\n  offset = offset.contiguous();\n  gradOutput = gradOutput.contiguous();\n\n  int batch = 1;\n\n  if (input.ndimension() == 3) {\n    // Force batch\n    batch = 0;\n    input = input.view({1, input.size(0), input.size(1), input.size(2)});\n    gradOutput = gradOutput.view(\n        {1, gradOutput.size(0), gradOutput.size(1), gradOutput.size(2)});\n  }\n\n  long batchSize = input.size(0);\n  long nInputPlane = input.size(1);\n  long inputHeight = input.size(2);\n  long inputWidth = input.size(3);\n\n  long nOutputPlane = gradWeight.size(0);\n\n  long outputWidth =\n      (inputWidth + 2 * 
padW - (dilationW * (kW - 1) + 1)) / dW + 1;\n  long outputHeight =\n      (inputHeight + 2 * padH - (dilationH * (kH - 1) + 1)) / dH + 1;\n\n  TORCH_CHECK(offset.size(0) == batchSize, \"invalid batch size of offset\");\n\n  columns = at::zeros(\n      {nInputPlane * kW * kH, im2col_step * outputHeight * outputWidth},\n      input.options());\n\n  gradOutput = gradOutput.view({batchSize / im2col_step, im2col_step,\n                                nOutputPlane, outputHeight, outputWidth});\n  gradOutput.transpose_(1, 2);\n\n  at::Tensor gradOutputBuffer = at::zeros_like(gradOutput);\n  gradOutputBuffer =\n      gradOutputBuffer.view({batchSize / im2col_step, nOutputPlane, im2col_step,\n                             outputHeight, outputWidth});\n  gradOutputBuffer.copy_(gradOutput);\n  gradOutputBuffer =\n      gradOutputBuffer.view({batchSize / im2col_step, nOutputPlane,\n                             im2col_step * outputHeight, outputWidth});\n\n  gradOutput.transpose_(1, 2);\n  gradOutput =\n      gradOutput.view({batchSize, nOutputPlane, outputHeight, outputWidth});\n\n  input = input.view({batchSize / im2col_step, im2col_step, nInputPlane,\n                      inputHeight, inputWidth});\n  offset =\n      offset.view({batchSize / im2col_step, im2col_step,\n                   deformable_group * 2 * kH * kW, outputHeight, outputWidth});\n\n  for (int elt = 0; elt < batchSize / im2col_step; elt++) {\n    deformable_im2col(input[elt], offset[elt], nInputPlane, inputHeight,\n                      inputWidth, kH, kW, padH, padW, dH, dW, dilationH,\n                      dilationW, im2col_step, deformable_group, columns);\n\n    // divide into group\n    gradOutputBuffer = gradOutputBuffer.view(\n        {gradOutputBuffer.size(0), group, gradOutputBuffer.size(1) / group,\n         gradOutputBuffer.size(2), gradOutputBuffer.size(3)});\n    columns = columns.view({group, columns.size(0) / group, columns.size(1)});\n    gradWeight =\n        gradWeight.view({group, 
gradWeight.size(0) / group, gradWeight.size(1),\n                         gradWeight.size(2), gradWeight.size(3)});\n\n    for (int g = 0; g < group; g++) {\n      gradWeight[g] = gradWeight[g]\n                          .flatten(1)\n                          .addmm_(gradOutputBuffer[elt][g].flatten(1),\n                                  columns[g].transpose(1, 0), 1.0, scale)\n                          .view_as(gradWeight[g]);\n    }\n    gradOutputBuffer = gradOutputBuffer.view(\n        {gradOutputBuffer.size(0),\n         gradOutputBuffer.size(1) * gradOutputBuffer.size(2),\n         gradOutputBuffer.size(3), gradOutputBuffer.size(4)});\n    columns =\n        columns.view({columns.size(0) * columns.size(1), columns.size(2)});\n    gradWeight = gradWeight.view({gradWeight.size(0) * gradWeight.size(1),\n                                  gradWeight.size(2), gradWeight.size(3),\n                                  gradWeight.size(4)});\n  }\n\n  input = input.view({batchSize, nInputPlane, inputHeight, inputWidth});\n  offset = offset.view(\n      {batchSize, deformable_group * 2 * kH * kW, outputHeight, outputWidth});\n\n  if (batch == 0) {\n    gradOutput = gradOutput.view({nOutputPlane, outputHeight, outputWidth});\n    input = input.view({nInputPlane, inputHeight, inputWidth});\n  }\n\n  return 1;\n}\n\nvoid modulated_deform_conv_cuda_forward(\n    at::Tensor input, at::Tensor weight, at::Tensor bias, at::Tensor ones,\n    at::Tensor offset, at::Tensor mask, at::Tensor output, at::Tensor columns,\n    int kernel_h, int kernel_w, const int stride_h, const int stride_w,\n    const int pad_h, const int pad_w, const int dilation_h,\n    const int dilation_w, const int group, const int deformable_group,\n    const bool with_bias) {\n  TORCH_CHECK(input.is_contiguous(), \"input tensor has to be contiguous\");\n  TORCH_CHECK(weight.is_contiguous(), \"weight tensor has to be contiguous\");\n\n  const int batch = input.size(0);\n  const int channels = input.size(1);\n  
const int height = input.size(2);\n  const int width = input.size(3);\n\n  const int channels_out = weight.size(0);\n  const int channels_kernel = weight.size(1);\n  const int kernel_h_ = weight.size(2);\n  const int kernel_w_ = weight.size(3);\n\n  TORCH_CHECK(kernel_h_ == kernel_h && kernel_w_ == kernel_w,\n              \"Input shape and kernel shape won't match: (\", kernel_h_, \" x \",\n              kernel_w_, \" vs \", kernel_h, \" x \", kernel_w, \").\");\n  TORCH_CHECK(channels == channels_kernel * group,\n              \"Input shape and kernel channels won't match: (\", channels,\n              \" vs \", channels_kernel * group, \").\");\n\n  const int height_out =\n      (height + 2 * pad_h - (dilation_h * (kernel_h - 1) + 1)) / stride_h + 1;\n  const int width_out =\n      (width + 2 * pad_w - (dilation_w * (kernel_w - 1) + 1)) / stride_w + 1;\n\n  if (ones.ndimension() != 2 ||\n      ones.size(0) * ones.size(1) < height_out * width_out) {\n    // Resize plane and fill with ones...\n    ones = at::ones({height_out, width_out}, input.options());\n  }\n\n  // resize output\n  output = output.view({batch, channels_out, height_out, width_out}).zero_();\n  // resize temporary columns\n  columns =\n      at::zeros({channels * kernel_h * kernel_w, 1 * height_out * width_out},\n                input.options());\n\n  output = output.view({output.size(0), group, output.size(1) / group,\n                        output.size(2), output.size(3)});\n\n  for (int b = 0; b < batch; b++) {\n    modulated_deformable_im2col_cuda(\n        input[b], offset[b], mask[b], 1, channels, height, width, height_out,\n        width_out, kernel_h, kernel_w, pad_h, pad_w, stride_h, stride_w,\n        dilation_h, dilation_w, deformable_group, columns);\n\n    // divide into group\n    weight = weight.view({group, weight.size(0) / group, weight.size(1),\n                          weight.size(2), weight.size(3)});\n    columns = columns.view({group, columns.size(0) / group, 
columns.size(1)});\n\n    for (int g = 0; g < group; g++) {\n      output[b][g] = output[b][g]\n                         .flatten(1)\n                         .addmm_(weight[g].flatten(1), columns[g])\n                         .view_as(output[b][g]);\n    }\n\n    weight = weight.view({weight.size(0) * weight.size(1), weight.size(2),\n                          weight.size(3), weight.size(4)});\n    columns =\n        columns.view({columns.size(0) * columns.size(1), columns.size(2)});\n  }\n\n  output = output.view({output.size(0), output.size(1) * output.size(2),\n                        output.size(3), output.size(4)});\n\n  if (with_bias) {\n    output += bias.view({1, bias.size(0), 1, 1});\n  }\n}\n\nvoid modulated_deform_conv_cuda_backward(\n    at::Tensor input, at::Tensor weight, at::Tensor bias, at::Tensor ones,\n    at::Tensor offset, at::Tensor mask, at::Tensor columns,\n    at::Tensor grad_input, at::Tensor grad_weight, at::Tensor grad_bias,\n    at::Tensor grad_offset, at::Tensor grad_mask, at::Tensor grad_output,\n    int kernel_h, int kernel_w, int stride_h, int stride_w, int pad_h,\n    int pad_w, int dilation_h, int dilation_w, int group, int deformable_group,\n    const bool with_bias) {\n  TORCH_CHECK(input.is_contiguous(), \"input tensor has to be contiguous\");\n  TORCH_CHECK(weight.is_contiguous(), \"weight tensor has to be contiguous\");\n\n  const int batch = input.size(0);\n  const int channels = input.size(1);\n  const int height = input.size(2);\n  const int width = input.size(3);\n\n  const int channels_kernel = weight.size(1);\n  const int kernel_h_ = weight.size(2);\n  const int kernel_w_ = weight.size(3);\n  TORCH_CHECK(kernel_h_ == kernel_h && kernel_w_ == kernel_w,\n              \"Input shape and kernel shape won't match: (\", kernel_h_, \" x \",\n              kernel_w_, \" vs \", kernel_h, \" x \", kernel_w, \").\");\n  TORCH_CHECK(channels == channels_kernel * group,\n              \"Input shape and kernel channels won't match: (\", 
channels,\n              \" vs \", channels_kernel * group, \").\");\n\n  const int height_out =\n      (height + 2 * pad_h - (dilation_h * (kernel_h - 1) + 1)) / stride_h + 1;\n  const int width_out =\n      (width + 2 * pad_w - (dilation_w * (kernel_w - 1) + 1)) / stride_w + 1;\n\n  if (ones.ndimension() != 2 ||\n      ones.size(0) * ones.size(1) < height_out * width_out) {\n    // Resize plane and fill with ones...\n    ones = at::ones({height_out, width_out}, input.options());\n  }\n\n  grad_input = grad_input.view({batch, channels, height, width});\n  columns = at::zeros({channels * kernel_h * kernel_w, height_out * width_out},\n                      input.options());\n\n  grad_output =\n      grad_output.view({grad_output.size(0), group, grad_output.size(1) / group,\n                        grad_output.size(2), grad_output.size(3)});\n\n  for (int b = 0; b < batch; b++) {\n    // divide into group\n    columns = columns.view({group, columns.size(0) / group, columns.size(1)});\n    weight = weight.view({group, weight.size(0) / group, weight.size(1),\n                          weight.size(2), weight.size(3)});\n\n    for (int g = 0; g < group; g++) {\n      columns[g].addmm_(weight[g].flatten(1).transpose(0, 1),\n                        grad_output[b][g].flatten(1), 0.0f, 1.0f);\n    }\n\n    columns =\n        columns.view({columns.size(0) * columns.size(1), columns.size(2)});\n    weight = weight.view({weight.size(0) * weight.size(1), weight.size(2),\n                          weight.size(3), weight.size(4)});\n\n    // gradient w.r.t. input coordinate data\n    modulated_deformable_col2im_coord_cuda(\n        columns, input[b], offset[b], mask[b], 1, channels, height, width,\n        height_out, width_out, kernel_h, kernel_w, pad_h, pad_w, stride_h,\n        stride_w, dilation_h, dilation_w, deformable_group, grad_offset[b],\n        grad_mask[b]);\n    // gradient w.r.t. 
input data\n    modulated_deformable_col2im_cuda(\n        columns, offset[b], mask[b], 1, channels, height, width, height_out,\n        width_out, kernel_h, kernel_w, pad_h, pad_w, stride_h, stride_w,\n        dilation_h, dilation_w, deformable_group, grad_input[b]);\n\n    // gradient w.r.t. weight, dWeight should accumulate across the batch and\n    // group\n    modulated_deformable_im2col_cuda(\n        input[b], offset[b], mask[b], 1, channels, height, width, height_out,\n        width_out, kernel_h, kernel_w, pad_h, pad_w, stride_h, stride_w,\n        dilation_h, dilation_w, deformable_group, columns);\n\n    columns = columns.view({group, columns.size(0) / group, columns.size(1)});\n    grad_weight = grad_weight.view({group, grad_weight.size(0) / group,\n                                    grad_weight.size(1), grad_weight.size(2),\n                                    grad_weight.size(3)});\n    if (with_bias)\n      grad_bias = grad_bias.view({group, grad_bias.size(0) / group});\n\n    for (int g = 0; g < group; g++) {\n      grad_weight[g] =\n          grad_weight[g]\n              .flatten(1)\n              .addmm_(grad_output[b][g].flatten(1), columns[g].transpose(0, 1))\n              .view_as(grad_weight[g]);\n      if (with_bias) {\n        grad_bias[g] =\n            grad_bias[g]\n                .view({-1, 1})\n                .addmm_(grad_output[b][g].flatten(1), ones.view({-1, 1}))\n                .view(-1);\n      }\n    }\n\n    columns =\n        columns.view({columns.size(0) * columns.size(1), columns.size(2)});\n    grad_weight = grad_weight.view({grad_weight.size(0) * grad_weight.size(1),\n                                    grad_weight.size(2), grad_weight.size(3),\n                                    grad_weight.size(4)});\n    if (with_bias)\n      grad_bias = grad_bias.view({grad_bias.size(0) * grad_bias.size(1)});\n  }\n  grad_output = grad_output.view({grad_output.size(0) * grad_output.size(1),\n                                  
grad_output.size(2), grad_output.size(3),\n                                  grad_output.size(4)});\n}\n\nPYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {\n  m.def(\"deform_conv_forward_cuda\", &deform_conv_forward_cuda,\n        \"deform forward (CUDA)\");\n  m.def(\"deform_conv_backward_input_cuda\", &deform_conv_backward_input_cuda,\n        \"deform_conv_backward_input (CUDA)\");\n  m.def(\"deform_conv_backward_parameters_cuda\",\n        &deform_conv_backward_parameters_cuda,\n        \"deform_conv_backward_parameters (CUDA)\");\n  m.def(\"modulated_deform_conv_cuda_forward\",\n        &modulated_deform_conv_cuda_forward,\n        \"modulated deform conv forward (CUDA)\");\n  m.def(\"modulated_deform_conv_cuda_backward\",\n        &modulated_deform_conv_cuda_backward,\n        \"modulated deform conv backward (CUDA)\");\n}\n"
  },
  {
    "path": "network/backbone/assets/dcn/src/deform_conv_cuda_kernel.cu",
    "content": "/*!\n ******************* BEGIN Caffe Copyright Notice and Disclaimer ****************\n *\n * COPYRIGHT\n *\n * All contributions by the University of California:\n * Copyright (c) 2014-2017 The Regents of the University of California (Regents)\n * All rights reserved.\n *\n * All other contributions:\n * Copyright (c) 2014-2017, the respective contributors\n * All rights reserved.\n *\n * Caffe uses a shared copyright model: each contributor holds copyright over\n * their contributions to Caffe. The project versioning records all such\n * contribution and copyright details. If a contributor wants to further mark\n * their specific copyright on a particular contribution, they should indicate\n * their copyright solely in the commit message of the change when it is\n * committed.\n *\n * LICENSE\n *\n * Redistribution and use in source and binary forms, with or without\n * modification, are permitted provided that the following conditions are met:\n *\n * 1. Redistributions of source code must retain the above copyright notice, this\n * list of conditions and the following disclaimer.\n * 2. Redistributions in binary form must reproduce the above copyright notice,\n * this list of conditions and the following disclaimer in the documentation\n * and/or other materials provided with the distribution.\n *\n * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\n * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\n * WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\n * DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR\n * ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n * (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\n * LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND\n * ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\n * SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n *\n * CONTRIBUTION AGREEMENT\n *\n * By contributing to the BVLC/caffe repository through pull-request, comment,\n * or otherwise, the contributor releases their content to the\n * license and copyright terms herein.\n *\n ***************** END Caffe Copyright Notice and Disclaimer ********************\n *\n * Copyright (c) 2018 Microsoft\n * Licensed under The MIT License [see LICENSE for details]\n * \\file modulated_deformable_im2col.cuh\n * \\brief Function definitions of converting an image to\n * column matrix based on kernel, padding, dilation, and offset.\n * These functions are mainly used in deformable convolution operators.\n * \\ref: https://arxiv.org/abs/1703.06211\n * \\author Yuwen Xiong, Haozhi Qi, Jifeng Dai, Xizhou Zhu, Han Hu, Dazhi Cheng\n */\n\n// modify from https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu\n\n#include <ATen/ATen.h>\n#include <ATen/cuda/Atomic.cuh>\n#include <cuda_fp16.h>\n#include <float.h>\n#include <math.h>\n#include <stdio.h>\n\nusing namespace at;\n\n#define CUDA_KERNEL_LOOP(i, n)                                 \\\n  for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < (n); \\\n       i += blockDim.x * gridDim.x)\n\nconst int CUDA_NUM_THREADS = 1024;\nconst int kMaxGridNum = 65535;\n\ninline int GET_BLOCKS(const int N)\n{\n  return std::min(kMaxGridNum, (N + CUDA_NUM_THREADS - 1) / 
CUDA_NUM_THREADS);\n}\n\ntemplate <typename T>\n__device__ inline void dcn_atomic_add(T *address, T val)\n{\n  atomicAdd(address, val);\n}\n\ntemplate <>\n__device__ inline void dcn_atomic_add<at::Half>(at::Half *address, at::Half val)\n{\n#if defined(__CUDA_ARCH__)\n  static_assert(sizeof(at::Half) == sizeof(__half), \"Half size mismatch\");\n  atomicAdd(reinterpret_cast<__half *>(address),\n            *reinterpret_cast<const __half *>(&val));\n#endif\n}\n\ntemplate <typename scalar_t>\n__device__ scalar_t deformable_im2col_bilinear(const scalar_t *bottom_data, const int data_width,\n                                               const int height, const int width, scalar_t h, scalar_t w)\n{\n\n  int h_low = floor(h);\n  int w_low = floor(w);\n  int h_high = h_low + 1;\n  int w_high = w_low + 1;\n\n  scalar_t lh = h - h_low;\n  scalar_t lw = w - w_low;\n  scalar_t hh = 1 - lh, hw = 1 - lw;\n\n  scalar_t v1 = 0;\n  if (h_low >= 0 && w_low >= 0)\n    v1 = bottom_data[h_low * data_width + w_low];\n  scalar_t v2 = 0;\n  if (h_low >= 0 && w_high <= width - 1)\n    v2 = bottom_data[h_low * data_width + w_high];\n  scalar_t v3 = 0;\n  if (h_high <= height - 1 && w_low >= 0)\n    v3 = bottom_data[h_high * data_width + w_low];\n  scalar_t v4 = 0;\n  if (h_high <= height - 1 && w_high <= width - 1)\n    v4 = bottom_data[h_high * data_width + w_high];\n\n  scalar_t w1 = hh * hw, w2 = hh * lw, w3 = lh * hw, w4 = lh * lw;\n\n  scalar_t val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4);\n  return val;\n}\n\ntemplate <typename scalar_t>\n__device__ scalar_t get_gradient_weight(scalar_t argmax_h, scalar_t argmax_w,\n                                        const int h, const int w, const int height, const int width)\n{\n\n  if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || argmax_w >= width)\n  {\n    //empty\n    return 0;\n  }\n\n  int argmax_h_low = floor(argmax_h);\n  int argmax_w_low = floor(argmax_w);\n  int argmax_h_high = argmax_h_low + 1;\n  int argmax_w_high = 
argmax_w_low + 1;\n\n  scalar_t weight = 0;\n  if (h == argmax_h_low && w == argmax_w_low)\n    weight = (h + 1 - argmax_h) * (w + 1 - argmax_w);\n  if (h == argmax_h_low && w == argmax_w_high)\n    weight = (h + 1 - argmax_h) * (argmax_w + 1 - w);\n  if (h == argmax_h_high && w == argmax_w_low)\n    weight = (argmax_h + 1 - h) * (w + 1 - argmax_w);\n  if (h == argmax_h_high && w == argmax_w_high)\n    weight = (argmax_h + 1 - h) * (argmax_w + 1 - w);\n  return weight;\n}\n\ntemplate <typename scalar_t>\n__device__ scalar_t get_coordinate_weight(scalar_t argmax_h, scalar_t argmax_w,\n                                          const int height, const int width, const scalar_t *im_data,\n                                          const int data_width, const int bp_dir)\n{\n\n  if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || argmax_w >= width)\n  {\n    //empty\n    return 0;\n  }\n\n  int argmax_h_low = floor(argmax_h);\n  int argmax_w_low = floor(argmax_w);\n  int argmax_h_high = argmax_h_low + 1;\n  int argmax_w_high = argmax_w_low + 1;\n\n  scalar_t weight = 0;\n\n  if (bp_dir == 0)\n  {\n    if (argmax_h_low >= 0 && argmax_w_low >= 0)\n      weight += -1 * (argmax_w_low + 1 - argmax_w) * im_data[argmax_h_low * data_width + argmax_w_low];\n    if (argmax_h_low >= 0 && argmax_w_high <= width - 1)\n      weight += -1 * (argmax_w - argmax_w_low) * im_data[argmax_h_low * data_width + argmax_w_high];\n    if (argmax_h_high <= height - 1 && argmax_w_low >= 0)\n      weight += (argmax_w_low + 1 - argmax_w) * im_data[argmax_h_high * data_width + argmax_w_low];\n    if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1)\n      weight += (argmax_w - argmax_w_low) * im_data[argmax_h_high * data_width + argmax_w_high];\n  }\n  else if (bp_dir == 1)\n  {\n    if (argmax_h_low >= 0 && argmax_w_low >= 0)\n      weight += -1 * (argmax_h_low + 1 - argmax_h) * im_data[argmax_h_low * data_width + argmax_w_low];\n    if (argmax_h_low >= 0 && argmax_w_high <= width 
- 1)\n      weight += (argmax_h_low + 1 - argmax_h) * im_data[argmax_h_low * data_width + argmax_w_high];\n    if (argmax_h_high <= height - 1 && argmax_w_low >= 0)\n      weight += -1 * (argmax_h - argmax_h_low) * im_data[argmax_h_high * data_width + argmax_w_low];\n    if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1)\n      weight += (argmax_h - argmax_h_low) * im_data[argmax_h_high * data_width + argmax_w_high];\n  }\n\n  return weight;\n}\n\ntemplate <typename scalar_t>\n__global__ void deformable_im2col_gpu_kernel(const int n, const scalar_t *data_im, const scalar_t *data_offset,\n                                             const int height, const int width, const int kernel_h, const int kernel_w,\n                                             const int pad_h, const int pad_w, const int stride_h, const int stride_w,\n                                             const int dilation_h, const int dilation_w, const int channel_per_deformable_group,\n                                             const int batch_size, const int num_channels, const int deformable_group,\n                                             const int height_col, const int width_col,\n                                             scalar_t *data_col)\n{\n  CUDA_KERNEL_LOOP(index, n)\n  {\n    // index index of output matrix\n    const int w_col = index % width_col;\n    const int h_col = (index / width_col) % height_col;\n    const int b_col = (index / width_col / height_col) % batch_size;\n    const int c_im = (index / width_col / height_col) / batch_size;\n    const int c_col = c_im * kernel_h * kernel_w;\n\n    // compute deformable group index\n    const int deformable_group_index = c_im / channel_per_deformable_group;\n\n    const int h_in = h_col * stride_h - pad_h;\n    const int w_in = w_col * stride_w - pad_w;\n    scalar_t *data_col_ptr = data_col + ((c_col * batch_size + b_col) * height_col + h_col) * width_col + w_col;\n    //const scalar_t* data_im_ptr = data_im + 
((b_col * num_channels + c_im) * height + h_in) * width + w_in;\n    const scalar_t *data_im_ptr = data_im + (b_col * num_channels + c_im) * height * width;\n    const scalar_t *data_offset_ptr = data_offset + (b_col * deformable_group + deformable_group_index) * 2 * kernel_h * kernel_w * height_col * width_col;\n\n    for (int i = 0; i < kernel_h; ++i)\n    {\n      for (int j = 0; j < kernel_w; ++j)\n      {\n        const int data_offset_h_ptr = ((2 * (i * kernel_w + j)) * height_col + h_col) * width_col + w_col;\n        const int data_offset_w_ptr = ((2 * (i * kernel_w + j) + 1) * height_col + h_col) * width_col + w_col;\n        const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr];\n        const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr];\n        scalar_t val = static_cast<scalar_t>(0);\n        const scalar_t h_im = h_in + i * dilation_h + offset_h;\n        const scalar_t w_im = w_in + j * dilation_w + offset_w;\n        if (h_im > -1 && w_im > -1 && h_im < height && w_im < width)\n        {\n          //const scalar_t map_h = i * dilation_h + offset_h;\n          //const scalar_t map_w = j * dilation_w + offset_w;\n          //const int cur_height = height - h_in;\n          //const int cur_width = width - w_in;\n          //val = deformable_im2col_bilinear(data_im_ptr, width, cur_height, cur_width, map_h, map_w);\n          val = deformable_im2col_bilinear(data_im_ptr, width, height, width, h_im, w_im);\n        }\n        *data_col_ptr = val;\n        data_col_ptr += batch_size * height_col * width_col;\n      }\n    }\n  }\n}\n\nvoid deformable_im2col(\n    const at::Tensor data_im, const at::Tensor data_offset, const int channels,\n    const int height, const int width, const int ksize_h, const int ksize_w,\n    const int pad_h, const int pad_w, const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w, const int parallel_imgs,\n    const int deformable_group, at::Tensor data_col)\n{\n  // num_axes 
should be smaller than block size\n  // todo: check parallel_imgs is correctly passed in\n  int height_col = (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1;\n  int width_col = (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1;\n  int num_kernels = channels * height_col * width_col * parallel_imgs;\n  int channel_per_deformable_group = channels / deformable_group;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      data_im.scalar_type(), \"deformable_im2col_gpu\", ([&] {\n        const scalar_t *data_im_ = data_im.data_ptr<scalar_t>();\n        const scalar_t *data_offset_ = data_offset.data_ptr<scalar_t>();\n        scalar_t *data_col_ = data_col.data_ptr<scalar_t>();\n\n        deformable_im2col_gpu_kernel<<<GET_BLOCKS(num_kernels), CUDA_NUM_THREADS>>>(\n            num_kernels, data_im_, data_offset_, height, width, ksize_h, ksize_w,\n            pad_h, pad_w, stride_h, stride_w, dilation_h, dilation_w,\n            channel_per_deformable_group, parallel_imgs, channels, deformable_group,\n            height_col, width_col, data_col_);\n      }));\n\n  cudaError_t err = cudaGetLastError();\n  if (err != cudaSuccess)\n  {\n    printf(\"error in deformable_im2col: %s\\n\", cudaGetErrorString(err));\n  }\n}\n\ntemplate <typename scalar_t>\n__global__ void deformable_col2im_gpu_kernel(\n    const int n, const scalar_t *data_col, const scalar_t *data_offset,\n    const int channels, const int height, const int width,\n    const int kernel_h, const int kernel_w,\n    const int pad_h, const int pad_w,\n    const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w,\n    const int channel_per_deformable_group,\n    const int batch_size, const int deformable_group,\n    const int height_col, const int width_col,\n    scalar_t *grad_im)\n{\n  CUDA_KERNEL_LOOP(index, n)\n  {\n    const int j = (index / width_col / height_col / batch_size) % kernel_w;\n    const int i = (index / width_col / height_col / 
batch_size / kernel_w) % kernel_h;\n    const int c = index / width_col / height_col / batch_size / kernel_w / kernel_h;\n    // compute the start and end of the output\n\n    const int deformable_group_index = c / channel_per_deformable_group;\n\n    int w_out = index % width_col;\n    int h_out = (index / width_col) % height_col;\n    int b = (index / width_col / height_col) % batch_size;\n    int w_in = w_out * stride_w - pad_w;\n    int h_in = h_out * stride_h - pad_h;\n\n    const scalar_t *data_offset_ptr = data_offset + (b * deformable_group + deformable_group_index) *\n                                                        2 * kernel_h * kernel_w * height_col * width_col;\n    const int data_offset_h_ptr = ((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out;\n    const int data_offset_w_ptr = ((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out;\n    const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr];\n    const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr];\n    const scalar_t cur_inv_h_data = h_in + i * dilation_h + offset_h;\n    const scalar_t cur_inv_w_data = w_in + j * dilation_w + offset_w;\n\n    const scalar_t cur_top_grad = data_col[index];\n    const int cur_h = (int)cur_inv_h_data;\n    const int cur_w = (int)cur_inv_w_data;\n    for (int dy = -2; dy <= 2; dy++)\n    {\n      for (int dx = -2; dx <= 2; dx++)\n      {\n        if (cur_h + dy >= 0 && cur_h + dy < height &&\n            cur_w + dx >= 0 && cur_w + dx < width &&\n            abs(cur_inv_h_data - (cur_h + dy)) < 1 &&\n            abs(cur_inv_w_data - (cur_w + dx)) < 1)\n        {\n          int cur_bottom_grad_pos = ((b * channels + c) * height + cur_h + dy) * width + cur_w + dx;\n          scalar_t weight = get_gradient_weight(cur_inv_h_data, cur_inv_w_data, cur_h + dy, cur_w + dx, height, width);\n          dcn_atomic_add(grad_im + cur_bottom_grad_pos, weight * cur_top_grad);\n        }\n      }\n    }\n  }\n}\n\nvoid 
deformable_col2im(\n    const at::Tensor data_col, const at::Tensor data_offset, const int channels,\n    const int height, const int width, const int ksize_h,\n    const int ksize_w, const int pad_h, const int pad_w,\n    const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w,\n    const int parallel_imgs, const int deformable_group,\n    at::Tensor grad_im)\n{\n\n  // todo: make sure parallel_imgs is passed in correctly\n  int height_col = (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1;\n  int width_col = (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1;\n  int num_kernels = channels * ksize_h * ksize_w * height_col * width_col * parallel_imgs;\n  int channel_per_deformable_group = channels / deformable_group;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      data_col.scalar_type(), \"deformable_col2im_gpu\", ([&] {\n        const scalar_t *data_col_ = data_col.data_ptr<scalar_t>();\n        const scalar_t *data_offset_ = data_offset.data_ptr<scalar_t>();\n        scalar_t *grad_im_ = grad_im.data_ptr<scalar_t>();\n\n        deformable_col2im_gpu_kernel<<<GET_BLOCKS(num_kernels), CUDA_NUM_THREADS>>>(\n            num_kernels, data_col_, data_offset_, channels, height, width, ksize_h,\n            ksize_w, pad_h, pad_w, stride_h, stride_w,\n            dilation_h, dilation_w, channel_per_deformable_group,\n            parallel_imgs, deformable_group, height_col, width_col, grad_im_);\n      }));\n\n  cudaError_t err = cudaGetLastError();\n  if (err != cudaSuccess)\n  {\n    printf(\"error in deformable_col2im: %s\\n\", cudaGetErrorString(err));\n  }\n}\n\ntemplate <typename scalar_t>\n__global__ void deformable_col2im_coord_gpu_kernel(const int n, const scalar_t *data_col,\n                                                   const scalar_t *data_im, const scalar_t *data_offset,\n                                                   const int channels, const int height, const int width,\n      
                                             const int kernel_h, const int kernel_w,\n                                                   const int pad_h, const int pad_w,\n                                                   const int stride_h, const int stride_w,\n                                                   const int dilation_h, const int dilation_w,\n                                                   const int channel_per_deformable_group,\n                                                   const int batch_size, const int offset_channels, const int deformable_group,\n                                                   const int height_col, const int width_col, scalar_t *grad_offset)\n{\n  CUDA_KERNEL_LOOP(index, n)\n  {\n    scalar_t val = 0;\n    int w = index % width_col;\n    int h = (index / width_col) % height_col;\n    int c = (index / width_col / height_col) % offset_channels;\n    int b = (index / width_col / height_col) / offset_channels;\n    // compute the start and end of the output\n\n    const int deformable_group_index = c / (2 * kernel_h * kernel_w);\n    const int col_step = kernel_h * kernel_w;\n    int cnt = 0;\n    const scalar_t *data_col_ptr = data_col + deformable_group_index * channel_per_deformable_group *\n                                                  batch_size * width_col * height_col;\n    const scalar_t *data_im_ptr = data_im + (b * deformable_group + deformable_group_index) *\n                                                channel_per_deformable_group / kernel_h / kernel_w * height * width;\n    const scalar_t *data_offset_ptr = data_offset + (b * deformable_group + deformable_group_index) * 2 *\n                                                        kernel_h * kernel_w * height_col * width_col;\n\n    const int offset_c = c - deformable_group_index * 2 * kernel_h * kernel_w;\n\n    for (int col_c = (offset_c / 2); col_c < channel_per_deformable_group; col_c += col_step)\n    {\n      const int col_pos = (((col_c * 
batch_size + b) * height_col) + h) * width_col + w;\n      const int bp_dir = offset_c % 2;\n\n      int j = (col_pos / width_col / height_col / batch_size) % kernel_w;\n      int i = (col_pos / width_col / height_col / batch_size / kernel_w) % kernel_h;\n      int w_out = col_pos % width_col;\n      int h_out = (col_pos / width_col) % height_col;\n      int w_in = w_out * stride_w - pad_w;\n      int h_in = h_out * stride_h - pad_h;\n      const int data_offset_h_ptr = (((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out);\n      const int data_offset_w_ptr = (((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out);\n      const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr];\n      const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr];\n      scalar_t inv_h = h_in + i * dilation_h + offset_h;\n      scalar_t inv_w = w_in + j * dilation_w + offset_w;\n      if (inv_h <= -1 || inv_w <= -1 || inv_h >= height || inv_w >= width)\n      {\n        inv_h = inv_w = -2;\n      }\n      const scalar_t weight = get_coordinate_weight(\n          inv_h, inv_w,\n          height, width, data_im_ptr + cnt * height * width, width, bp_dir);\n      val += weight * data_col_ptr[col_pos];\n      cnt += 1;\n    }\n\n    grad_offset[index] = val;\n  }\n}\n\nvoid deformable_col2im_coord(\n    const at::Tensor data_col, const at::Tensor data_im, const at::Tensor data_offset,\n    const int channels, const int height, const int width, const int ksize_h,\n    const int ksize_w, const int pad_h, const int pad_w, const int stride_h,\n    const int stride_w, const int dilation_h, const int dilation_w,\n    const int parallel_imgs, const int deformable_group, at::Tensor grad_offset)\n{\n\n  int height_col = (height + 2 * pad_h - (dilation_h * (ksize_h - 1) + 1)) / stride_h + 1;\n  int width_col = (width + 2 * pad_w - (dilation_w * (ksize_w - 1) + 1)) / stride_w + 1;\n  int num_kernels = height_col * width_col * 2 * ksize_h * ksize_w * 
deformable_group * parallel_imgs;\n  int channel_per_deformable_group = channels * ksize_h * ksize_w / deformable_group;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      data_col.scalar_type(), \"deformable_col2im_coord_gpu\", ([&] {\n        const scalar_t *data_col_ = data_col.data_ptr<scalar_t>();\n        const scalar_t *data_im_ = data_im.data_ptr<scalar_t>();\n        const scalar_t *data_offset_ = data_offset.data_ptr<scalar_t>();\n        scalar_t *grad_offset_ = grad_offset.data_ptr<scalar_t>();\n\n        deformable_col2im_coord_gpu_kernel<<<GET_BLOCKS(num_kernels), CUDA_NUM_THREADS>>>(\n            num_kernels, data_col_, data_im_, data_offset_, channels, height, width,\n            ksize_h, ksize_w, pad_h, pad_w, stride_h, stride_w,\n            dilation_h, dilation_w, channel_per_deformable_group,\n            parallel_imgs, 2 * ksize_h * ksize_w * deformable_group, deformable_group,\n            height_col, width_col, grad_offset_);\n      }));\n}\n\ntemplate <typename scalar_t>\n__device__ scalar_t dmcn_im2col_bilinear(const scalar_t *bottom_data, const int data_width,\n                                         const int height, const int width, scalar_t h, scalar_t w)\n{\n  int h_low = floor(h);\n  int w_low = floor(w);\n  int h_high = h_low + 1;\n  int w_high = w_low + 1;\n\n  scalar_t lh = h - h_low;\n  scalar_t lw = w - w_low;\n  scalar_t hh = 1 - lh, hw = 1 - lw;\n\n  scalar_t v1 = 0;\n  if (h_low >= 0 && w_low >= 0)\n    v1 = bottom_data[h_low * data_width + w_low];\n  scalar_t v2 = 0;\n  if (h_low >= 0 && w_high <= width - 1)\n    v2 = bottom_data[h_low * data_width + w_high];\n  scalar_t v3 = 0;\n  if (h_high <= height - 1 && w_low >= 0)\n    v3 = bottom_data[h_high * data_width + w_low];\n  scalar_t v4 = 0;\n  if (h_high <= height - 1 && w_high <= width - 1)\n    v4 = bottom_data[h_high * data_width + w_high];\n\n  scalar_t w1 = hh * hw, w2 = hh * lw, w3 = lh * hw, w4 = lh * lw;\n\n  scalar_t val = (w1 * v1 + w2 * v2 + w3 * v3 + w4 * v4);\n  
return val;\n}\n\ntemplate <typename scalar_t>\n__device__ scalar_t dmcn_get_gradient_weight(scalar_t argmax_h, scalar_t argmax_w,\n                                             const int h, const int w, const int height, const int width)\n{\n  if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || argmax_w >= width)\n  {\n    //empty\n    return 0;\n  }\n\n  int argmax_h_low = floor(argmax_h);\n  int argmax_w_low = floor(argmax_w);\n  int argmax_h_high = argmax_h_low + 1;\n  int argmax_w_high = argmax_w_low + 1;\n\n  scalar_t weight = 0;\n  if (h == argmax_h_low && w == argmax_w_low)\n    weight = (h + 1 - argmax_h) * (w + 1 - argmax_w);\n  if (h == argmax_h_low && w == argmax_w_high)\n    weight = (h + 1 - argmax_h) * (argmax_w + 1 - w);\n  if (h == argmax_h_high && w == argmax_w_low)\n    weight = (argmax_h + 1 - h) * (w + 1 - argmax_w);\n  if (h == argmax_h_high && w == argmax_w_high)\n    weight = (argmax_h + 1 - h) * (argmax_w + 1 - w);\n  return weight;\n}\n\ntemplate <typename scalar_t>\n__device__ scalar_t dmcn_get_coordinate_weight(scalar_t argmax_h, scalar_t argmax_w,\n                                               const int height, const int width, const scalar_t *im_data,\n                                               const int data_width, const int bp_dir)\n{\n  if (argmax_h <= -1 || argmax_h >= height || argmax_w <= -1 || argmax_w >= width)\n  {\n    //empty\n    return 0;\n  }\n\n  int argmax_h_low = floor(argmax_h);\n  int argmax_w_low = floor(argmax_w);\n  int argmax_h_high = argmax_h_low + 1;\n  int argmax_w_high = argmax_w_low + 1;\n\n  scalar_t weight = 0;\n\n  if (bp_dir == 0)\n  {\n    if (argmax_h_low >= 0 && argmax_w_low >= 0)\n      weight += -1 * (argmax_w_low + 1 - argmax_w) * im_data[argmax_h_low * data_width + argmax_w_low];\n    if (argmax_h_low >= 0 && argmax_w_high <= width - 1)\n      weight += -1 * (argmax_w - argmax_w_low) * im_data[argmax_h_low * data_width + argmax_w_high];\n    if (argmax_h_high <= height - 1 && 
argmax_w_low >= 0)\n      weight += (argmax_w_low + 1 - argmax_w) * im_data[argmax_h_high * data_width + argmax_w_low];\n    if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1)\n      weight += (argmax_w - argmax_w_low) * im_data[argmax_h_high * data_width + argmax_w_high];\n  }\n  else if (bp_dir == 1)\n  {\n    if (argmax_h_low >= 0 && argmax_w_low >= 0)\n      weight += -1 * (argmax_h_low + 1 - argmax_h) * im_data[argmax_h_low * data_width + argmax_w_low];\n    if (argmax_h_low >= 0 && argmax_w_high <= width - 1)\n      weight += (argmax_h_low + 1 - argmax_h) * im_data[argmax_h_low * data_width + argmax_w_high];\n    if (argmax_h_high <= height - 1 && argmax_w_low >= 0)\n      weight += -1 * (argmax_h - argmax_h_low) * im_data[argmax_h_high * data_width + argmax_w_low];\n    if (argmax_h_high <= height - 1 && argmax_w_high <= width - 1)\n      weight += (argmax_h - argmax_h_low) * im_data[argmax_h_high * data_width + argmax_w_high];\n  }\n\n  return weight;\n}\n\ntemplate <typename scalar_t>\n__global__ void modulated_deformable_im2col_gpu_kernel(const int n,\n                                                       const scalar_t *data_im, const scalar_t *data_offset, const scalar_t *data_mask,\n                                                       const int height, const int width, const int kernel_h, const int kernel_w,\n                                                       const int pad_h, const int pad_w,\n                                                       const int stride_h, const int stride_w,\n                                                       const int dilation_h, const int dilation_w,\n                                                       const int channel_per_deformable_group,\n                                                       const int batch_size, const int num_channels, const int deformable_group,\n                                                       const int height_col, const int width_col,\n                              
                         scalar_t *data_col)\n{\n  CUDA_KERNEL_LOOP(index, n)\n  {\n    // index index of output matrix\n    const int w_col = index % width_col;\n    const int h_col = (index / width_col) % height_col;\n    const int b_col = (index / width_col / height_col) % batch_size;\n    const int c_im = (index / width_col / height_col) / batch_size;\n    const int c_col = c_im * kernel_h * kernel_w;\n\n    // compute deformable group index\n    const int deformable_group_index = c_im / channel_per_deformable_group;\n\n    const int h_in = h_col * stride_h - pad_h;\n    const int w_in = w_col * stride_w - pad_w;\n\n    scalar_t *data_col_ptr = data_col + ((c_col * batch_size + b_col) * height_col + h_col) * width_col + w_col;\n    //const float* data_im_ptr = data_im + ((b_col * num_channels + c_im) * height + h_in) * width + w_in;\n    const scalar_t *data_im_ptr = data_im + (b_col * num_channels + c_im) * height * width;\n    const scalar_t *data_offset_ptr = data_offset + (b_col * deformable_group + deformable_group_index) * 2 * kernel_h * kernel_w * height_col * width_col;\n\n    const scalar_t *data_mask_ptr = data_mask + (b_col * deformable_group + deformable_group_index) * kernel_h * kernel_w * height_col * width_col;\n\n    for (int i = 0; i < kernel_h; ++i)\n    {\n      for (int j = 0; j < kernel_w; ++j)\n      {\n        const int data_offset_h_ptr = ((2 * (i * kernel_w + j)) * height_col + h_col) * width_col + w_col;\n        const int data_offset_w_ptr = ((2 * (i * kernel_w + j) + 1) * height_col + h_col) * width_col + w_col;\n        const int data_mask_hw_ptr = ((i * kernel_w + j) * height_col + h_col) * width_col + w_col;\n        const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr];\n        const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr];\n        const scalar_t mask = data_mask_ptr[data_mask_hw_ptr];\n        scalar_t val = static_cast<scalar_t>(0);\n        const scalar_t h_im = h_in + i * dilation_h + offset_h;\n       
 const scalar_t w_im = w_in + j * dilation_w + offset_w;\n        //if (h_im >= 0 && w_im >= 0 && h_im < height && w_im < width) {\n        if (h_im > -1 && w_im > -1 && h_im < height && w_im < width)\n        {\n          //const float map_h = i * dilation_h + offset_h;\n          //const float map_w = j * dilation_w + offset_w;\n          //const int cur_height = height - h_in;\n          //const int cur_width = width - w_in;\n          //val = dmcn_im2col_bilinear(data_im_ptr, width, cur_height, cur_width, map_h, map_w);\n          val = dmcn_im2col_bilinear(data_im_ptr, width, height, width, h_im, w_im);\n        }\n        *data_col_ptr = val * mask;\n        data_col_ptr += batch_size * height_col * width_col;\n        //data_col_ptr += height_col * width_col;\n      }\n    }\n  }\n}\n\ntemplate <typename scalar_t>\n__global__ void modulated_deformable_col2im_gpu_kernel(const int n,\n                                                       const scalar_t *data_col, const scalar_t *data_offset, const scalar_t *data_mask,\n                                                       const int channels, const int height, const int width,\n                                                       const int kernel_h, const int kernel_w,\n                                                       const int pad_h, const int pad_w,\n                                                       const int stride_h, const int stride_w,\n                                                       const int dilation_h, const int dilation_w,\n                                                       const int channel_per_deformable_group,\n                                                       const int batch_size, const int deformable_group,\n                                                       const int height_col, const int width_col,\n                                                       scalar_t *grad_im)\n{\n  CUDA_KERNEL_LOOP(index, n)\n  {\n    const int j = (index / width_col / height_col / 
batch_size) % kernel_w;\n    const int i = (index / width_col / height_col / batch_size / kernel_w) % kernel_h;\n    const int c = index / width_col / height_col / batch_size / kernel_w / kernel_h;\n    // compute the start and end of the output\n\n    const int deformable_group_index = c / channel_per_deformable_group;\n\n    int w_out = index % width_col;\n    int h_out = (index / width_col) % height_col;\n    int b = (index / width_col / height_col) % batch_size;\n    int w_in = w_out * stride_w - pad_w;\n    int h_in = h_out * stride_h - pad_h;\n\n    const scalar_t *data_offset_ptr = data_offset + (b * deformable_group + deformable_group_index) * 2 * kernel_h * kernel_w * height_col * width_col;\n    const scalar_t *data_mask_ptr = data_mask + (b * deformable_group + deformable_group_index) * kernel_h * kernel_w * height_col * width_col;\n    const int data_offset_h_ptr = ((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out;\n    const int data_offset_w_ptr = ((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out;\n    const int data_mask_hw_ptr = ((i * kernel_w + j) * height_col + h_out) * width_col + w_out;\n    const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr];\n    const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr];\n    const scalar_t mask = data_mask_ptr[data_mask_hw_ptr];\n    const scalar_t cur_inv_h_data = h_in + i * dilation_h + offset_h;\n    const scalar_t cur_inv_w_data = w_in + j * dilation_w + offset_w;\n\n    const scalar_t cur_top_grad = data_col[index] * mask;\n    const int cur_h = (int)cur_inv_h_data;\n    const int cur_w = (int)cur_inv_w_data;\n    for (int dy = -2; dy <= 2; dy++)\n    {\n      for (int dx = -2; dx <= 2; dx++)\n      {\n        if (cur_h + dy >= 0 && cur_h + dy < height &&\n            cur_w + dx >= 0 && cur_w + dx < width &&\n            abs(cur_inv_h_data - (cur_h + dy)) < 1 &&\n            abs(cur_inv_w_data - (cur_w + dx)) < 1)\n        {\n          int 
cur_bottom_grad_pos = ((b * channels + c) * height + cur_h + dy) * width + cur_w + dx;\n          scalar_t weight = dmcn_get_gradient_weight(cur_inv_h_data, cur_inv_w_data, cur_h + dy, cur_w + dx, height, width);\n          dcn_atomic_add(grad_im + cur_bottom_grad_pos, weight * cur_top_grad);\n        }\n      }\n    }\n  }\n}\n\ntemplate <typename scalar_t>\n__global__ void modulated_deformable_col2im_coord_gpu_kernel(const int n,\n                                                             const scalar_t *data_col, const scalar_t *data_im,\n                                                             const scalar_t *data_offset, const scalar_t *data_mask,\n                                                             const int channels, const int height, const int width,\n                                                             const int kernel_h, const int kernel_w,\n                                                             const int pad_h, const int pad_w,\n                                                             const int stride_h, const int stride_w,\n                                                             const int dilation_h, const int dilation_w,\n                                                             const int channel_per_deformable_group,\n                                                             const int batch_size, const int offset_channels, const int deformable_group,\n                                                             const int height_col, const int width_col,\n                                                             scalar_t *grad_offset, scalar_t *grad_mask)\n{\n  CUDA_KERNEL_LOOP(index, n)\n  {\n    scalar_t val = 0, mval = 0;\n    int w = index % width_col;\n    int h = (index / width_col) % height_col;\n    int c = (index / width_col / height_col) % offset_channels;\n    int b = (index / width_col / height_col) / offset_channels;\n    // compute the start and end of the output\n\n    const int 
deformable_group_index = c / (2 * kernel_h * kernel_w);\n    const int col_step = kernel_h * kernel_w;\n    int cnt = 0;\n    const scalar_t *data_col_ptr = data_col + deformable_group_index * channel_per_deformable_group * batch_size * width_col * height_col;\n    const scalar_t *data_im_ptr = data_im + (b * deformable_group + deformable_group_index) * channel_per_deformable_group / kernel_h / kernel_w * height * width;\n    const scalar_t *data_offset_ptr = data_offset + (b * deformable_group + deformable_group_index) * 2 * kernel_h * kernel_w * height_col * width_col;\n    const scalar_t *data_mask_ptr = data_mask + (b * deformable_group + deformable_group_index) * kernel_h * kernel_w * height_col * width_col;\n\n    const int offset_c = c - deformable_group_index * 2 * kernel_h * kernel_w;\n\n    for (int col_c = (offset_c / 2); col_c < channel_per_deformable_group; col_c += col_step)\n    {\n      const int col_pos = (((col_c * batch_size + b) * height_col) + h) * width_col + w;\n      const int bp_dir = offset_c % 2;\n\n      int j = (col_pos / width_col / height_col / batch_size) % kernel_w;\n      int i = (col_pos / width_col / height_col / batch_size / kernel_w) % kernel_h;\n      int w_out = col_pos % width_col;\n      int h_out = (col_pos / width_col) % height_col;\n      int w_in = w_out * stride_w - pad_w;\n      int h_in = h_out * stride_h - pad_h;\n      const int data_offset_h_ptr = (((2 * (i * kernel_w + j)) * height_col + h_out) * width_col + w_out);\n      const int data_offset_w_ptr = (((2 * (i * kernel_w + j) + 1) * height_col + h_out) * width_col + w_out);\n      const int data_mask_hw_ptr = (((i * kernel_w + j) * height_col + h_out) * width_col + w_out);\n      const scalar_t offset_h = data_offset_ptr[data_offset_h_ptr];\n      const scalar_t offset_w = data_offset_ptr[data_offset_w_ptr];\n      const scalar_t mask = data_mask_ptr[data_mask_hw_ptr];\n      scalar_t inv_h = h_in + i * dilation_h + offset_h;\n      scalar_t inv_w = w_in + j * 
dilation_w + offset_w;\n      if (inv_h <= -1 || inv_w <= -1 || inv_h >= height || inv_w >= width)\n      {\n        inv_h = inv_w = -2;\n      }\n      else\n      {\n        mval += data_col_ptr[col_pos] * dmcn_im2col_bilinear(data_im_ptr + cnt * height * width, width, height, width, inv_h, inv_w);\n      }\n      const scalar_t weight = dmcn_get_coordinate_weight(\n          inv_h, inv_w,\n          height, width, data_im_ptr + cnt * height * width, width, bp_dir);\n      val += weight * data_col_ptr[col_pos] * mask;\n      cnt += 1;\n    }\n    // KERNEL_ASSIGN(grad_offset[index], offset_req, val);\n    grad_offset[index] = val;\n    if (offset_c % 2 == 0)\n      // KERNEL_ASSIGN(grad_mask[(((b * deformable_group + deformable_group_index) * kernel_h * kernel_w + offset_c / 2) * height_col + h) * width_col + w], mask_req, mval);\n      grad_mask[(((b * deformable_group + deformable_group_index) * kernel_h * kernel_w + offset_c / 2) * height_col + h) * width_col + w] = mval;\n  }\n}\n\nvoid modulated_deformable_im2col_cuda(\n    const at::Tensor data_im, const at::Tensor data_offset, const at::Tensor data_mask,\n    const int batch_size, const int channels, const int height_im, const int width_im,\n    const int height_col, const int width_col, const int kernel_h, const int kenerl_w,\n    const int pad_h, const int pad_w, const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w,\n    const int deformable_group, at::Tensor data_col)\n{\n  // num_axes should be smaller than block size\n  const int channel_per_deformable_group = channels / deformable_group;\n  const int num_kernels = channels * batch_size * height_col * width_col;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      data_im.scalar_type(), \"modulated_deformable_im2col_gpu\", ([&] {\n        const scalar_t *data_im_ = data_im.data_ptr<scalar_t>();\n        const scalar_t *data_offset_ = data_offset.data_ptr<scalar_t>();\n        const scalar_t *data_mask_ = 
data_mask.data_ptr<scalar_t>();\n        scalar_t *data_col_ = data_col.data_ptr<scalar_t>();\n\n        modulated_deformable_im2col_gpu_kernel<<<GET_BLOCKS(num_kernels), CUDA_NUM_THREADS>>>(\n            num_kernels, data_im_, data_offset_, data_mask_, height_im, width_im, kernel_h, kenerl_w,\n            pad_h, pad_w, stride_h, stride_w, dilation_h, dilation_w, channel_per_deformable_group,\n            batch_size, channels, deformable_group, height_col, width_col, data_col_);\n      }));\n\n  cudaError_t err = cudaGetLastError();\n  if (err != cudaSuccess)\n  {\n    printf(\"error in modulated_deformable_im2col_cuda: %s\\n\", cudaGetErrorString(err));\n  }\n}\n\nvoid modulated_deformable_col2im_cuda(\n    const at::Tensor data_col, const at::Tensor data_offset, const at::Tensor data_mask,\n    const int batch_size, const int channels, const int height_im, const int width_im,\n    const int height_col, const int width_col, const int kernel_h, const int kernel_w,\n    const int pad_h, const int pad_w, const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w,\n    const int deformable_group, at::Tensor grad_im)\n{\n\n  const int channel_per_deformable_group = channels / deformable_group;\n  const int num_kernels = channels * kernel_h * kernel_w * batch_size * height_col * width_col;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      data_col.scalar_type(), \"modulated_deformable_col2im_gpu\", ([&] {\n        const scalar_t *data_col_ = data_col.data_ptr<scalar_t>();\n        const scalar_t *data_offset_ = data_offset.data_ptr<scalar_t>();\n        const scalar_t *data_mask_ = data_mask.data_ptr<scalar_t>();\n        scalar_t *grad_im_ = grad_im.data_ptr<scalar_t>();\n\n        modulated_deformable_col2im_gpu_kernel<<<GET_BLOCKS(num_kernels), CUDA_NUM_THREADS>>>(\n            num_kernels, data_col_, data_offset_, data_mask_, channels, height_im, width_im,\n            kernel_h, kernel_w, pad_h, pad_w, stride_h, stride_w,\n            
dilation_h, dilation_w, channel_per_deformable_group,\n            batch_size, deformable_group, height_col, width_col, grad_im_);\n      }));\n\n  cudaError_t err = cudaGetLastError();\n  if (err != cudaSuccess)\n  {\n    printf(\"error in modulated_deformable_col2im_cuda: %s\\n\", cudaGetErrorString(err));\n  }\n}\n\nvoid modulated_deformable_col2im_coord_cuda(\n    const at::Tensor data_col, const at::Tensor data_im, const at::Tensor data_offset, const at::Tensor data_mask,\n    const int batch_size, const int channels, const int height_im, const int width_im,\n    const int height_col, const int width_col, const int kernel_h, const int kernel_w,\n    const int pad_h, const int pad_w, const int stride_h, const int stride_w,\n    const int dilation_h, const int dilation_w,\n    const int deformable_group,\n    at::Tensor grad_offset, at::Tensor grad_mask)\n{\n  const int num_kernels = batch_size * height_col * width_col * 2 * kernel_h * kernel_w * deformable_group;\n  const int channel_per_deformable_group = channels * kernel_h * kernel_w / deformable_group;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      data_col.scalar_type(), \"modulated_deformable_col2im_coord_gpu\", ([&] {\n        const scalar_t *data_col_ = data_col.data_ptr<scalar_t>();\n        const scalar_t *data_im_ = data_im.data_ptr<scalar_t>();\n        const scalar_t *data_offset_ = data_offset.data_ptr<scalar_t>();\n        const scalar_t *data_mask_ = data_mask.data_ptr<scalar_t>();\n        scalar_t *grad_offset_ = grad_offset.data_ptr<scalar_t>();\n        scalar_t *grad_mask_ = grad_mask.data_ptr<scalar_t>();\n\n        modulated_deformable_col2im_coord_gpu_kernel<<<GET_BLOCKS(num_kernels), CUDA_NUM_THREADS>>>(\n            num_kernels, data_col_, data_im_, data_offset_, data_mask_, channels, height_im, width_im,\n            kernel_h, kernel_w, pad_h, pad_w, stride_h, stride_w,\n            dilation_h, dilation_w, channel_per_deformable_group,\n            batch_size, 2 * kernel_h * 
kernel_w * deformable_group, deformable_group, height_col, width_col,\n            grad_offset_, grad_mask_);\n      }));\n  cudaError_t err = cudaGetLastError();\n  if (err != cudaSuccess)\n  {\n    printf(\"error in modulated_deformable_col2im_coord_cuda: %s\\n\", cudaGetErrorString(err));\n  }\n}\n"
  },
  {
    "path": "network/backbone/assets/dcn/src/deform_pool_cuda.cpp",
    "content": "// modify from\n// https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/modulated_dcn_cuda.c\n\n// based on\n// author: Charles Shang\n// https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu\n\n#include <torch/extension.h>\n\n#include <cmath>\n#include <vector>\n\nvoid DeformablePSROIPoolForward(\n    const at::Tensor data, const at::Tensor bbox, const at::Tensor trans,\n    at::Tensor out, at::Tensor top_count, const int batch, const int channels,\n    const int height, const int width, const int num_bbox,\n    const int channels_trans, const int no_trans, const float spatial_scale,\n    const int output_dim, const int group_size, const int pooled_size,\n    const int part_size, const int sample_per_part, const float trans_std);\n\nvoid DeformablePSROIPoolBackwardAcc(\n    const at::Tensor out_grad, const at::Tensor data, const at::Tensor bbox,\n    const at::Tensor trans, const at::Tensor top_count, at::Tensor in_grad,\n    at::Tensor trans_grad, const int batch, const int channels,\n    const int height, const int width, const int num_bbox,\n    const int channels_trans, const int no_trans, const float spatial_scale,\n    const int output_dim, const int group_size, const int pooled_size,\n    const int part_size, const int sample_per_part, const float trans_std);\n\nvoid deform_psroi_pooling_cuda_forward(\n    at::Tensor input, at::Tensor bbox, at::Tensor trans, at::Tensor out,\n    at::Tensor top_count, const int no_trans, const float spatial_scale,\n    const int output_dim, const int group_size, const int pooled_size,\n    const int part_size, const int sample_per_part, const float trans_std) {\n  TORCH_CHECK(input.is_contiguous(), \"input tensor has to be contiguous\");\n\n  const int batch = input.size(0);\n  const int channels = input.size(1);\n  const int height = input.size(2);\n  const int width = input.size(3);\n  const int channels_trans = no_trans ? 
2 : trans.size(1);\n\n  const int num_bbox = bbox.size(0);\n  TORCH_CHECK(num_bbox == out.size(0),\n              \"Output shape and bbox number won't match: (\", out.size(0),\n              \" vs \", num_bbox, \").\");\n\n  DeformablePSROIPoolForward(\n      input, bbox, trans, out, top_count, batch, channels, height, width,\n      num_bbox, channels_trans, no_trans, spatial_scale, output_dim, group_size,\n      pooled_size, part_size, sample_per_part, trans_std);\n}\n\nvoid deform_psroi_pooling_cuda_backward(\n    at::Tensor out_grad, at::Tensor input, at::Tensor bbox, at::Tensor trans,\n    at::Tensor top_count, at::Tensor input_grad, at::Tensor trans_grad,\n    const int no_trans, const float spatial_scale, const int output_dim,\n    const int group_size, const int pooled_size, const int part_size,\n    const int sample_per_part, const float trans_std) {\n  TORCH_CHECK(out_grad.is_contiguous(), \"out_grad tensor has to be contiguous\");\n  TORCH_CHECK(input.is_contiguous(), \"input tensor has to be contiguous\");\n\n  const int batch = input.size(0);\n  const int channels = input.size(1);\n  const int height = input.size(2);\n  const int width = input.size(3);\n  const int channels_trans = no_trans ? 
2 : trans.size(1);\n\n  const int num_bbox = bbox.size(0);\n  TORCH_CHECK(num_bbox == out_grad.size(0),\n              \"Output shape and bbox number won't match: (\", out_grad.size(0),\n              \" vs \", num_bbox, \").\");\n\n  DeformablePSROIPoolBackwardAcc(\n      out_grad, input, bbox, trans, top_count, input_grad, trans_grad, batch,\n      channels, height, width, num_bbox, channels_trans, no_trans,\n      spatial_scale, output_dim, group_size, pooled_size, part_size,\n      sample_per_part, trans_std);\n}\n\nPYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {\n  m.def(\"deform_psroi_pooling_cuda_forward\", &deform_psroi_pooling_cuda_forward,\n        \"deform psroi pooling forward(CUDA)\");\n  m.def(\"deform_psroi_pooling_cuda_backward\",\n        &deform_psroi_pooling_cuda_backward,\n        \"deform psroi pooling backward(CUDA)\");\n}\n"
  },
  {
    "path": "network/backbone/assets/dcn/src/deform_pool_cuda_kernel.cu",
    "content": "/*!\n * Copyright (c) 2017 Microsoft\n * Licensed under The MIT License [see LICENSE for details]\n * \\file deformable_psroi_pooling.cu\n * \\brief\n * \\author Yi Li, Guodong Zhang, Jifeng Dai\n*/\n/***************** Adapted by Charles Shang *********************/\n// modify from https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/cuda/deform_psroi_pooling_cuda.cu\n\n#include <ATen/ATen.h>\n#include <ATen/cuda/Atomic.cuh>\n#include <algorithm>\n#include <cuda_fp16.h>\n#include <math.h>\n#include <stdio.h>\n\nusing namespace at;\n\n#define CUDA_KERNEL_LOOP(i, n)                        \\\n  for (int i = blockIdx.x * blockDim.x + threadIdx.x; \\\n       i < (n);                                       \\\n       i += blockDim.x * gridDim.x)\n\nconst int CUDA_NUM_THREADS = 1024;\ninline int GET_BLOCKS(const int N)\n{\n  return (N + CUDA_NUM_THREADS - 1) / CUDA_NUM_THREADS;\n}\n\ntemplate <typename T>\n__device__ inline void dcn_atomic_add(T *address, T val)\n{\n  atomicAdd(address, val);\n}\n\ntemplate <>\n__device__ inline void dcn_atomic_add<at::Half>(at::Half *address, at::Half val)\n{\n#if defined(__CUDA_ARCH__)\n  static_assert(sizeof(at::Half) == sizeof(__half), \"Half size mismatch\");\n  atomicAdd(reinterpret_cast<__half *>(address),\n            *reinterpret_cast<const __half *>(&val));\n#endif\n}\n\ntemplate <typename scalar_t>\n__device__ scalar_t bilinear_interp(\n    const scalar_t *data,\n    const scalar_t x,\n    const scalar_t y,\n    const int width,\n    const int height)\n{\n  int x1 = floor(x);\n  int x2 = ceil(x);\n  int y1 = floor(y);\n  int y2 = ceil(y);\n  scalar_t dist_x = (scalar_t)(x - x1);\n  scalar_t dist_y = (scalar_t)(y - y1);\n  scalar_t value11 = data[y1 * width + x1];\n  scalar_t value12 = data[y2 * width + x1];\n  scalar_t value21 = data[y1 * width + x2];\n  scalar_t value22 = data[y2 * width + x2];\n  scalar_t value = (1 - dist_x) * (1 - dist_y) * value11 + (1 - dist_x) 
* dist_y * value12 + dist_x * (1 - dist_y) * value21 + dist_x * dist_y * value22;\n  return value;\n}\n\ntemplate <typename scalar_t>\n__global__ void DeformablePSROIPoolForwardKernel(\n    const int count,\n    const scalar_t *bottom_data,\n    const scalar_t spatial_scale,\n    const int channels,\n    const int height, const int width,\n    const int pooled_height, const int pooled_width,\n    const scalar_t *bottom_rois, const scalar_t *bottom_trans,\n    const int no_trans,\n    const scalar_t trans_std,\n    const int sample_per_part,\n    const int output_dim,\n    const int group_size,\n    const int part_size,\n    const int num_classes,\n    const int channels_each_class,\n    scalar_t *top_data,\n    scalar_t *top_count)\n{\n  CUDA_KERNEL_LOOP(index, count)\n  {\n    // The output is in order (n, ctop, ph, pw)\n    int pw = index % pooled_width;\n    int ph = (index / pooled_width) % pooled_height;\n    int ctop = (index / pooled_width / pooled_height) % output_dim;\n    int n = index / pooled_width / pooled_height / output_dim;\n\n    // [start, end) interval for spatial sampling\n    const scalar_t *offset_bottom_rois = bottom_rois + n * 5;\n    int roi_batch_ind = offset_bottom_rois[0];\n    scalar_t roi_start_w = (scalar_t)(round(offset_bottom_rois[1])) * spatial_scale - 0.5;\n    scalar_t roi_start_h = (scalar_t)(round(offset_bottom_rois[2])) * spatial_scale - 0.5;\n    scalar_t roi_end_w = (scalar_t)(round(offset_bottom_rois[3]) + 1.) * spatial_scale - 0.5;\n    scalar_t roi_end_h = (scalar_t)(round(offset_bottom_rois[4]) + 1.) 
* spatial_scale - 0.5;\n\n    // Force too small ROIs to be 1x1\n    scalar_t roi_width = max(roi_end_w - roi_start_w, 0.1); //avoid 0\n    scalar_t roi_height = max(roi_end_h - roi_start_h, 0.1);\n\n    // Compute w and h at bottom\n    scalar_t bin_size_h = roi_height / (scalar_t)(pooled_height);\n    scalar_t bin_size_w = roi_width / (scalar_t)(pooled_width);\n\n    scalar_t sub_bin_size_h = bin_size_h / (scalar_t)(sample_per_part);\n    scalar_t sub_bin_size_w = bin_size_w / (scalar_t)(sample_per_part);\n\n    int part_h = floor((scalar_t)(ph) / pooled_height * part_size);\n    int part_w = floor((scalar_t)(pw) / pooled_width * part_size);\n    int class_id = ctop / channels_each_class;\n    scalar_t trans_x = no_trans ? (scalar_t)(0) : bottom_trans[(((n * num_classes + class_id) * 2) * part_size + part_h) * part_size + part_w] * (scalar_t)trans_std;\n    scalar_t trans_y = no_trans ? (scalar_t)(0) : bottom_trans[(((n * num_classes + class_id) * 2 + 1) * part_size + part_h) * part_size + part_w] * (scalar_t)trans_std;\n\n    scalar_t wstart = (scalar_t)(pw)*bin_size_w + roi_start_w;\n    wstart += trans_x * roi_width;\n    scalar_t hstart = (scalar_t)(ph)*bin_size_h + roi_start_h;\n    hstart += trans_y * roi_height;\n\n    scalar_t sum = 0;\n    int count = 0;\n    int gw = floor((scalar_t)(pw)*group_size / pooled_width);\n    int gh = floor((scalar_t)(ph)*group_size / pooled_height);\n    gw = min(max(gw, 0), group_size - 1);\n    gh = min(max(gh, 0), group_size - 1);\n\n    const scalar_t *offset_bottom_data = bottom_data + (roi_batch_ind * channels) * height * width;\n    for (int ih = 0; ih < sample_per_part; ih++)\n    {\n      for (int iw = 0; iw < sample_per_part; iw++)\n      {\n        scalar_t w = wstart + iw * sub_bin_size_w;\n        scalar_t h = hstart + ih * sub_bin_size_h;\n        // bilinear interpolation\n        if (w < -0.5 || w > width - 0.5 || h < -0.5 || h > height - 0.5)\n        {\n          continue;\n        }\n        w = min(max(w, 
0.), width - 1.);\n        h = min(max(h, 0.), height - 1.);\n        int c = (ctop * group_size + gh) * group_size + gw;\n        scalar_t val = bilinear_interp(offset_bottom_data + c * height * width, w, h, width, height);\n        sum += val;\n        count++;\n      }\n    }\n    top_data[index] = count == 0 ? (scalar_t)(0) : sum / count;\n    top_count[index] = count;\n  }\n}\n\ntemplate <typename scalar_t>\n__global__ void DeformablePSROIPoolBackwardAccKernel(\n    const int count,\n    const scalar_t *top_diff,\n    const scalar_t *top_count,\n    const int num_rois,\n    const scalar_t spatial_scale,\n    const int channels,\n    const int height, const int width,\n    const int pooled_height, const int pooled_width,\n    const int output_dim,\n    scalar_t *bottom_data_diff, scalar_t *bottom_trans_diff,\n    const scalar_t *bottom_data,\n    const scalar_t *bottom_rois,\n    const scalar_t *bottom_trans,\n    const int no_trans,\n    const scalar_t trans_std,\n    const int sample_per_part,\n    const int group_size,\n    const int part_size,\n    const int num_classes,\n    const int channels_each_class)\n{\n  CUDA_KERNEL_LOOP(index, count)\n  {\n    // The output is in order (n, ctop, ph, pw)\n    int pw = index % pooled_width;\n    int ph = (index / pooled_width) % pooled_height;\n    int ctop = (index / pooled_width / pooled_height) % output_dim;\n    int n = index / pooled_width / pooled_height / output_dim;\n\n    // [start, end) interval for spatial sampling\n    const scalar_t *offset_bottom_rois = bottom_rois + n * 5;\n    int roi_batch_ind = offset_bottom_rois[0];\n    scalar_t roi_start_w = (scalar_t)(round(offset_bottom_rois[1])) * spatial_scale - 0.5;\n    scalar_t roi_start_h = (scalar_t)(round(offset_bottom_rois[2])) * spatial_scale - 0.5;\n    scalar_t roi_end_w = (scalar_t)(round(offset_bottom_rois[3]) + 1.) * spatial_scale - 0.5;\n    scalar_t roi_end_h = (scalar_t)(round(offset_bottom_rois[4]) + 1.) 
* spatial_scale - 0.5;\n\n    // Force too small ROIs to be 1x1\n    scalar_t roi_width = max(roi_end_w - roi_start_w, 0.1); //avoid 0\n    scalar_t roi_height = max(roi_end_h - roi_start_h, 0.1);\n\n    // Compute w and h at bottom\n    scalar_t bin_size_h = roi_height / (scalar_t)(pooled_height);\n    scalar_t bin_size_w = roi_width / (scalar_t)(pooled_width);\n\n    scalar_t sub_bin_size_h = bin_size_h / (scalar_t)(sample_per_part);\n    scalar_t sub_bin_size_w = bin_size_w / (scalar_t)(sample_per_part);\n\n    int part_h = floor((scalar_t)(ph) / pooled_height * part_size);\n    int part_w = floor((scalar_t)(pw) / pooled_width * part_size);\n    int class_id = ctop / channels_each_class;\n    scalar_t trans_x = no_trans ? (scalar_t)(0) : bottom_trans[(((n * num_classes + class_id) * 2) * part_size + part_h) * part_size + part_w] * (scalar_t)trans_std;\n    scalar_t trans_y = no_trans ? (scalar_t)(0) : bottom_trans[(((n * num_classes + class_id) * 2 + 1) * part_size + part_h) * part_size + part_w] * (scalar_t)trans_std;\n\n    scalar_t wstart = (scalar_t)(pw)*bin_size_w + roi_start_w;\n    wstart += trans_x * roi_width;\n    scalar_t hstart = (scalar_t)(ph)*bin_size_h + roi_start_h;\n    hstart += trans_y * roi_height;\n\n    if (top_count[index] <= 0)\n    {\n      continue;\n    }\n    scalar_t diff_val = top_diff[index] / top_count[index];\n    const scalar_t *offset_bottom_data = bottom_data + roi_batch_ind * channels * height * width;\n    scalar_t *offset_bottom_data_diff = bottom_data_diff + roi_batch_ind * channels * height * width;\n    int gw = floor((scalar_t)(pw)*group_size / pooled_width);\n    int gh = floor((scalar_t)(ph)*group_size / pooled_height);\n    gw = min(max(gw, 0), group_size - 1);\n    gh = min(max(gh, 0), group_size - 1);\n\n    for (int ih = 0; ih < sample_per_part; ih++)\n    {\n      for (int iw = 0; iw < sample_per_part; iw++)\n      {\n        scalar_t w = wstart + iw * sub_bin_size_w;\n        scalar_t h = hstart + ih * 
sub_bin_size_h;\n        // bilinear interpolation\n        if (w < -0.5 || w > width - 0.5 || h < -0.5 || h > height - 0.5)\n        {\n          continue;\n        }\n        w = min(max(w, 0.), width - 1.);\n        h = min(max(h, 0.), height - 1.);\n        int c = (ctop * group_size + gh) * group_size + gw;\n        // backward on feature\n        int x0 = floor(w);\n        int x1 = ceil(w);\n        int y0 = floor(h);\n        int y1 = ceil(h);\n        scalar_t dist_x = w - x0, dist_y = h - y0;\n        scalar_t q00 = (1 - dist_x) * (1 - dist_y);\n        scalar_t q01 = (1 - dist_x) * dist_y;\n        scalar_t q10 = dist_x * (1 - dist_y);\n        scalar_t q11 = dist_x * dist_y;\n        int bottom_index_base = c * height * width;\n        dcn_atomic_add(offset_bottom_data_diff + bottom_index_base + y0 * width + x0, q00 * diff_val);\n        dcn_atomic_add(offset_bottom_data_diff + bottom_index_base + y1 * width + x0, q01 * diff_val);\n        dcn_atomic_add(offset_bottom_data_diff + bottom_index_base + y0 * width + x1, q10 * diff_val);\n        dcn_atomic_add(offset_bottom_data_diff + bottom_index_base + y1 * width + x1, q11 * diff_val);\n\n        if (no_trans)\n        {\n          continue;\n        }\n        scalar_t U00 = offset_bottom_data[bottom_index_base + y0 * width + x0];\n        scalar_t U01 = offset_bottom_data[bottom_index_base + y1 * width + x0];\n        scalar_t U10 = offset_bottom_data[bottom_index_base + y0 * width + x1];\n        scalar_t U11 = offset_bottom_data[bottom_index_base + y1 * width + x1];\n        scalar_t diff_x = (U11 * dist_y + U10 * (1 - dist_y) - U01 * dist_y - U00 * (1 - dist_y)) * trans_std * diff_val;\n        diff_x *= roi_width;\n        scalar_t diff_y = (U11 * dist_x + U01 * (1 - dist_x) - U10 * dist_x - U00 * (1 - dist_x)) * trans_std * diff_val;\n        diff_y *= roi_height;\n\n        dcn_atomic_add(bottom_trans_diff + (((n * num_classes + class_id) * 2) * part_size + part_h) * part_size + part_w, 
diff_x);\n        dcn_atomic_add(bottom_trans_diff + (((n * num_classes + class_id) * 2 + 1) * part_size + part_h) * part_size + part_w, diff_y);\n      }\n    }\n  }\n}\n\nvoid DeformablePSROIPoolForward(const at::Tensor data,\n                                const at::Tensor bbox,\n                                const at::Tensor trans,\n                                at::Tensor out,\n                                at::Tensor top_count,\n                                const int batch,\n                                const int channels,\n                                const int height,\n                                const int width,\n                                const int num_bbox,\n                                const int channels_trans,\n                                const int no_trans,\n                                const float spatial_scale,\n                                const int output_dim,\n                                const int group_size,\n                                const int pooled_size,\n                                const int part_size,\n                                const int sample_per_part,\n                                const float trans_std)\n{\n  const int pooled_height = pooled_size;\n  const int pooled_width = pooled_size;\n  const int count = num_bbox * output_dim * pooled_height * pooled_width;\n  const int num_classes = no_trans ? 1 : channels_trans / 2;\n  const int channels_each_class = no_trans ? output_dim : output_dim / num_classes;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      data.scalar_type(), \"deformable_psroi_pool_forward\", ([&] {\n        const scalar_t *bottom_data = data.data_ptr<scalar_t>();\n        const scalar_t *bottom_rois = bbox.data_ptr<scalar_t>();\n        const scalar_t *bottom_trans = no_trans ? 
nullptr : trans.data_ptr<scalar_t>();\n        scalar_t *top_data = out.data_ptr<scalar_t>();\n        scalar_t *top_count_data = top_count.data_ptr<scalar_t>();\n\n        DeformablePSROIPoolForwardKernel<<<GET_BLOCKS(count), CUDA_NUM_THREADS>>>(\n            count, bottom_data, (scalar_t)spatial_scale, channels, height, width, pooled_height, pooled_width,\n            bottom_rois, bottom_trans, no_trans, (scalar_t)trans_std, sample_per_part, output_dim,\n            group_size, part_size, num_classes, channels_each_class, top_data, top_count_data);\n      }));\n\n  cudaError_t err = cudaGetLastError();\n  if (err != cudaSuccess)\n  {\n    printf(\"error in DeformablePSROIPoolForward: %s\\n\", cudaGetErrorString(err));\n  }\n}\n\nvoid DeformablePSROIPoolBackwardAcc(const at::Tensor out_grad,\n                                    const at::Tensor data,\n                                    const at::Tensor bbox,\n                                    const at::Tensor trans,\n                                    const at::Tensor top_count,\n                                    at::Tensor in_grad,\n                                    at::Tensor trans_grad,\n                                    const int batch,\n                                    const int channels,\n                                    const int height,\n                                    const int width,\n                                    const int num_bbox,\n                                    const int channels_trans,\n                                    const int no_trans,\n                                    const float spatial_scale,\n                                    const int output_dim,\n                                    const int group_size,\n                                    const int pooled_size,\n                                    const int part_size,\n                                    const int sample_per_part,\n                                    const float trans_std)\n{\n  // 
LOG(INFO) << \"DeformablePSROIPoolBackward\";\n  const int num_rois = num_bbox;\n  const int pooled_height = pooled_size;\n  const int pooled_width = pooled_size;\n  const int count = num_bbox * output_dim * pooled_height * pooled_width;\n  const int num_classes = no_trans ? 1 : channels_trans / 2;\n  const int channels_each_class = no_trans ? output_dim : output_dim / num_classes;\n\n  AT_DISPATCH_FLOATING_TYPES_AND_HALF(\n      out_grad.scalar_type(), \"deformable_psroi_pool_backward_acc\", ([&] {\n        const scalar_t *top_diff = out_grad.data_ptr<scalar_t>();\n        const scalar_t *bottom_data = data.data_ptr<scalar_t>();\n        const scalar_t *bottom_rois = bbox.data_ptr<scalar_t>();\n        const scalar_t *bottom_trans = no_trans ? nullptr : trans.data_ptr<scalar_t>();\n        scalar_t *bottom_data_diff = in_grad.data_ptr<scalar_t>();\n        scalar_t *bottom_trans_diff = no_trans ? nullptr : trans_grad.data_ptr<scalar_t>();\n        const scalar_t *top_count_data = top_count.data_ptr<scalar_t>();\n\n        DeformablePSROIPoolBackwardAccKernel<<<GET_BLOCKS(count), CUDA_NUM_THREADS>>>(\n            count, top_diff, top_count_data, num_rois, (scalar_t)spatial_scale, channels, height, width,\n            pooled_height, pooled_width, output_dim, bottom_data_diff, bottom_trans_diff,\n            bottom_data, bottom_rois, bottom_trans, no_trans, (scalar_t)trans_std, sample_per_part,\n            group_size, part_size, num_classes, channels_each_class);\n      }));\n\n  cudaError_t err = cudaGetLastError();\n  if (err != cudaSuccess)\n  {\n    printf(\"error in DeformablePSROIPoolBackwardAcc: %s\\n\", cudaGetErrorString(err));\n  }\n}"
  },
  {
    "path": "network/backbone/resnet.py",
    "content": "import torch.nn as nn\nimport math\nimport torch.utils.model_zoo as model_zoo\nBatchNorm2d = nn.BatchNorm2d\n\n__all__ = ['ResNet', 'resnet18', 'resnet34', 'resnet50', 'resnet101',\n           'resnet152']\n\n\nmodel_urls = {\n    'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth',\n    'resnet34': 'https://download.pytorch.org/models/resnet34-333f7ec4.pth',\n    'resnet50': 'https://download.pytorch.org/models/resnet50-19c8e357.pth',\n    'resnet101': 'https://download.pytorch.org/models/resnet101-5d3b4d8f.pth',\n    'resnet152': 'https://download.pytorch.org/models/resnet152-b121ed2d.pth',\n}\n\n\ndef constant_init(module, constant, bias=0):\n    nn.init.constant_(module.weight, constant)\n    if hasattr(module, 'bias'):\n        nn.init.constant_(module.bias, bias)\n\n\ndef conv3x3(in_planes, out_planes, stride=1):\n    \"\"\"3x3 convolution with padding\"\"\"\n    return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,\n                     padding=1, bias=False)\n\n\nclass BasicBlock(nn.Module):\n    expansion = 1\n\n    def __init__(self, inplanes, planes, stride=1, downsample=None, dcn=None):\n        super(BasicBlock, self).__init__()\n        self.with_dcn = dcn is not None\n        self.conv1 = conv3x3(inplanes, planes, stride)\n        self.bn1 = BatchNorm2d(planes)\n        self.relu = nn.ReLU(inplace=True)\n        self.with_modulated_dcn = False\n        if self.with_dcn:\n            fallback_on_stride = dcn.get('fallback_on_stride', False)\n            self.with_modulated_dcn = dcn.get('modulated', False)\n        # self.conv2 = conv3x3(planes, planes)\n        if not self.with_dcn or fallback_on_stride:\n            self.conv2 = nn.Conv2d(planes, planes, kernel_size=3,\n                                   padding=1, bias=False)\n        else:\n            deformable_groups = dcn.get('deformable_groups', 1)\n            if not self.with_modulated_dcn:\n                from network.backbone.assets.dcn 
import DeformConv\n                conv_op = DeformConv\n                offset_channels = 18\n            else:\n                from network.backbone.assets.dcn import ModulatedDeformConv\n                conv_op = ModulatedDeformConv\n                offset_channels = 27\n            self.conv2_offset = nn.Conv2d(\n                planes,\n                deformable_groups * offset_channels,\n                kernel_size=3,\n                padding=1)\n            self.conv2 = conv_op(\n                planes,\n                planes,\n                kernel_size=3,\n                padding=1,\n                deformable_groups=deformable_groups,\n                bias=False)\n        self.bn2 = BatchNorm2d(planes)\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x):\n        residual = x\n\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        # out = self.conv2(out)\n        if not self.with_dcn:\n            out = self.conv2(out)\n        elif self.with_modulated_dcn:\n            offset_mask = self.conv2_offset(out)\n            offset = offset_mask[:, :18, :, :]\n            mask = offset_mask[:, -9:, :, :].sigmoid()\n            out = self.conv2(out, offset, mask)\n        else:\n            offset = self.conv2_offset(out)\n            out = self.conv2(out, offset)\n        out = self.bn2(out)\n\n        if self.downsample is not None:\n            residual = self.downsample(x)\n\n        out += residual\n        out = self.relu(out)\n\n        return out\n\n\nclass Bottleneck(nn.Module):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, stride=1, downsample=None, dcn=None):\n        super(Bottleneck, self).__init__()\n        self.with_dcn = dcn is not None\n        self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)\n        self.bn1 = BatchNorm2d(planes)\n        fallback_on_stride = False\n        self.with_modulated_dcn = False\n        
if self.with_dcn:\n            fallback_on_stride = dcn.get('fallback_on_stride', False)\n            self.with_modulated_dcn = dcn.get('modulated', False)\n        if not self.with_dcn or fallback_on_stride:\n            self.conv2 = nn.Conv2d(planes, planes, kernel_size=3,\n                                   stride=stride, padding=1, bias=False)\n        else:\n            deformable_groups = dcn.get('deformable_groups', 1)\n            if not self.with_modulated_dcn:\n                from network.backbone.assets.dcn import DeformConv\n                conv_op = DeformConv\n                offset_channels = 18\n            else:\n                from network.backbone.assets.dcn import ModulatedDeformConv\n                conv_op = ModulatedDeformConv\n                offset_channels = 27\n            self.conv2_offset = nn.Conv2d(\n                planes, deformable_groups * offset_channels,\n                kernel_size=3,\n                padding=1)\n            self.conv2 = conv_op(\n                planes, planes, kernel_size=3, padding=1, stride=stride,\n                deformable_groups=deformable_groups, bias=False)\n        self.bn2 = BatchNorm2d(planes)\n        self.conv3 = nn.Conv2d(planes, planes * 4, kernel_size=1, bias=False)\n        self.bn3 = BatchNorm2d(planes * 4)\n        self.relu = nn.ReLU(inplace=True)\n        self.downsample = downsample\n        self.stride = stride\n        self.dcn = dcn\n        self.with_dcn = dcn is not None\n\n    def forward(self, x):\n        residual = x\n\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        # out = self.conv2(out)\n        if not self.with_dcn:\n            out = self.conv2(out)\n        elif self.with_modulated_dcn:\n            offset_mask = self.conv2_offset(out)\n            offset = offset_mask[:, :18, :, :]\n            mask = offset_mask[:, -9:, :, :].sigmoid()\n            out = self.conv2(out, offset, mask)\n        else:\n            offset 
= self.conv2_offset(out)\n            out = self.conv2(out, offset)\n        out = self.bn2(out)\n        out = self.relu(out)\n\n        out = self.conv3(out)\n        out = self.bn3(out)\n\n        if self.downsample is not None:\n            residual = self.downsample(x)\n\n        out += residual\n        out = self.relu(out)\n\n        return out\n\n\nclass ResNet(nn.Module):\n    def __init__(self, block, layers, num_classes=1000, \n                 dcn=None, stage_with_dcn=(False, False, False, False)):\n        self.dcn = dcn\n        self.stage_with_dcn = stage_with_dcn\n        self.inplanes = 64\n        super(ResNet, self).__init__()\n        self.conv1 = nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3,\n                               bias=False)\n        self.bn1 = BatchNorm2d(64)\n        self.relu = nn.ReLU(inplace=True)\n        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)\n        self.layer1 = self._make_layer(block, 64, layers[0])\n        self.layer2 = self._make_layer(\n            block, 128, layers[1], stride=2, dcn=dcn)\n        self.layer3 = self._make_layer(\n            block, 256, layers[2], stride=2, dcn=dcn)\n        self.layer4 = self._make_layer(\n            block, 512, layers[3], stride=2, dcn=dcn)\n        self.avgpool = nn.AvgPool2d(7, stride=1)\n        self.fc = nn.Linear(512 * block.expansion, num_classes)\n    \n        self.smooth = nn.Conv2d(2048, 256, kernel_size=1, stride=1, padding=1)    \n\n        for m in self.modules():\n            if isinstance(m, nn.Conv2d):\n                n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels\n                m.weight.data.normal_(0, math.sqrt(2. 
/ n))\n            elif isinstance(m, BatchNorm2d):\n                m.weight.data.fill_(1)\n                m.bias.data.zero_()\n        if self.dcn is not None:\n            for m in self.modules():\n                if isinstance(m, Bottleneck) or isinstance(m, BasicBlock):\n                    if hasattr(m, 'conv2_offset'):\n                        constant_init(m.conv2_offset, 0)\n\n    def _make_layer(self, block, planes, blocks, stride=1, dcn=None):\n        downsample = None\n        if stride != 1 or self.inplanes != planes * block.expansion:\n            downsample = nn.Sequential(\n                nn.Conv2d(self.inplanes, planes * block.expansion,\n                          kernel_size=1, stride=stride, bias=False),\n                BatchNorm2d(planes * block.expansion),\n            )\n\n        layers = []\n        layers.append(block(self.inplanes, planes,\n                            stride, downsample, dcn=dcn))\n        self.inplanes = planes * block.expansion\n        for i in range(1, blocks):\n            layers.append(block(self.inplanes, planes, dcn=dcn))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = self.relu(x)\n        x1 = self.maxpool(x)\n\n        x2 = self.layer1(x1)\n        x3 = self.layer2(x2)\n        x4 = self.layer3(x3)\n        x5 = self.layer4(x4)\n\n        return x1, x2, x3, x4, x5\n\n\ndef resnet18(pretrained=True, **kwargs):\n    \"\"\"Constructs a ResNet-18 model.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(BasicBlock, [2, 2, 2, 2], **kwargs)\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(\n            model_urls['resnet18']), strict=False)\n    return model\n\ndef deformable_resnet18(pretrained=True, **kwargs):\n    \"\"\"Constructs a ResNet-18 model.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n 
   \"\"\"\n    model = ResNet(BasicBlock, [2, 2, 2, 2],\n                    dcn=dict(modulated=True,\n                            deformable_groups=1,\n                            fallback_on_stride=False),\n                    stage_with_dcn=[False, True, True, True], **kwargs)\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(\n            model_urls['resnet18']), strict=False)\n    return model\n\n\ndef resnet34(pretrained=True, **kwargs):\n    \"\"\"Constructs a ResNet-34 model.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(BasicBlock, [3, 4, 6, 3], **kwargs)\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(\n            model_urls['resnet34']), strict=False)\n    return model\n\n\ndef resnet50(pretrained=True, **kwargs):\n    \"\"\"Constructs a ResNet-50 model.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(Bottleneck, [3, 4, 6, 3], **kwargs)\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(\n            model_urls['resnet50']), strict=False)\n    return model\n\n\ndef deformable_resnet50(pretrained=True, **kwargs):\n    \"\"\"Constructs a ResNet-50 model with deformable conv.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(Bottleneck, [3, 4, 6, 3],\n                   dcn=dict(modulated=True,\n                            deformable_groups=1,\n                            fallback_on_stride=False),\n                   stage_with_dcn=[False, True, True, True],\n                   **kwargs)\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(\n            model_urls['resnet50']), strict=False)\n    return model\n\n\ndef resnet101(pretrained=True, **kwargs):\n    \"\"\"Constructs a ResNet-101 model.\n    Args:\n        pretrained (bool): If True, returns a model 
pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(Bottleneck, [3, 4, 23, 3], **kwargs)\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(\n            model_urls['resnet101']), strict=False)\n    return model\n\n\ndef resnet152(pretrained=True, **kwargs):\n    \"\"\"Constructs a ResNet-152 model.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(Bottleneck, [3, 8, 36, 3], **kwargs)\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(\n            model_urls['resnet152']), strict=False)\n    return model\n"
  },
  {
    "path": "network/backbone/vgg.py",
"content": "import torch.nn as nn\nimport torch.utils.model_zoo as model_zoo\nimport torchvision.models as models\n\nmodel_urls = {\n    'vgg11': 'https://download.pytorch.org/models/vgg11-bbd30ac9.pth',\n    'vgg16': 'https://download.pytorch.org/models/vgg16-397923af.pth',\n    'vgg19': 'https://download.pytorch.org/models/vgg19-dcbb9e9d.pth',\n    'vgg11_bn': 'https://download.pytorch.org/models/vgg11_bn-6002323d.pth',\n    'vgg16_bn': 'https://download.pytorch.org/models/vgg16_bn-6c64b313.pth',\n    'vgg19_bn': 'https://download.pytorch.org/models/vgg19_bn-c79401a0.pth',\n}\n\n\nclass VggNet(nn.Module):\n    def __init__(self, name=\"vgg16\", pretrain=True):\n        super().__init__()\n        if name == \"vgg16\":\n            base_net = models.vgg16(pretrained=False)\n        elif name == \"vgg16_bn\":\n            base_net = models.vgg16_bn(pretrained=False)\n        else:\n            raise ValueError(\"base model {} is not supported!\".format(name))\n        if pretrain:\n            print(\"load the {} weights from ./cache\".format(name))\n            base_net.load_state_dict(model_zoo.load_url(model_urls[name], model_dir=\"./cache\"))\n\n        if name == \"vgg16\":\n            self.stage1 = nn.Sequential(*[base_net.features[layer] for layer in range(0, 5)])\n            self.stage2 = nn.Sequential(*[base_net.features[layer] for layer in range(5, 10)])\n            self.stage3 = nn.Sequential(*[base_net.features[layer] for layer in range(10, 17)])\n            self.stage4 = nn.Sequential(*[base_net.features[layer] for layer in range(17, 24)])\n            self.stage5 = nn.Sequential(*[base_net.features[layer] for layer in range(24, 31)])\n        elif name == \"vgg16_bn\":\n            self.stage1 = nn.Sequential(*[base_net.features[layer] for layer in range(0, 7)])\n            self.stage2 = nn.Sequential(*[base_net.features[layer] for layer in range(7, 14)])\n            self.stage3 = nn.Sequential(*[base_net.features[layer] for layer in range(14, 24)])\n
            self.stage4 = nn.Sequential(*[base_net.features[layer] for layer in range(24, 34)])\n            self.stage5 = nn.Sequential(*[base_net.features[layer] for layer in range(34, 44)])\n\n    def forward(self, x):\n        C1 = self.stage1(x)\n        C2 = self.stage2(C1)\n        C3 = self.stage3(C2)\n        C4 = self.stage4(C3)\n        C5 = self.stage5(C4)\n\n        return C1, C2, C3, C4, C5\n\n\nif __name__ == '__main__':\n    import torch\n    input = torch.randn((4, 3, 512, 512))\n    net = VggNet()\n    C1, C2, C3, C4, C5 = net(input)\n    print(C1.size())\n    print(C2.size())\n    print(C3.size())\n    print(C4.size())\n    print(C5.size())\n"
  },
  {
    "path": "network/layers/Adaptive_Deformation.py",
    "content": "###################################################################\n# File Name: AdaptiveDeformation.py\n# Author: S.X.Zhang\n###################################################################\n\nfrom __future__ import print_function\nfrom __future__ import division\nfrom __future__ import absolute_import\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom torch.nn import init\n\n\nclass MeanAggregator(nn.Module):\n    def __init__(self):\n        super(MeanAggregator, self).__init__()\n\n    def forward(self, features, A):\n        x = torch.bmm(A, features)\n        return x\n\n\nclass GraphConv(nn.Module):\n    def __init__(self, in_dim, out_dim, agg):\n        super(GraphConv, self).__init__()\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        self.weight = nn.Parameter(torch.FloatTensor(in_dim * 2, out_dim))\n        self.bias = nn.Parameter(torch.FloatTensor(out_dim))\n        init.xavier_uniform_(self.weight)\n        init.constant_(self.bias, 0)\n        self.agg = agg()\n\n    def forward(self, features, A):\n        b, n, d = features.shape\n        assert (d == self.in_dim)\n        agg_feats = self.agg(features, A)\n        cat_feats = torch.cat([features, agg_feats], dim=2)\n        out = torch.einsum('bnd,df->bnf', (cat_feats, self.weight))\n        out = F.relu(out + self.bias)\n        return out\n\n\nclass AdaptiveDeformation(nn.Module):\n    def __init__(self, input, state_dim):\n        super(AdaptiveDeformation, self).__init__()\n        self.bn0 = nn.BatchNorm1d(input, affine=False)\n        self.conv1 = nn.Conv1d(input, state_dim, 1)\n        self.rnn = nn.LSTM(input, state_dim, 1, bidirectional=True)\n        self.gconv1 = GraphConv(input, 256, MeanAggregator)\n        self.gconv2 = GraphConv(256, 1024, MeanAggregator)\n        self.gconv3 = GraphConv(1024, 512, MeanAggregator)\n        self.gconv4 = GraphConv(512, state_dim, MeanAggregator)\n\n        self.prediction = 
nn.Sequential(\n            nn.Conv1d(4*state_dim, 128, 1),\n            nn.ReLU(inplace=True),\n            nn.Dropout(0.1),\n            nn.Conv1d(128, 64, 1),\n            nn.ReLU(inplace=True),\n            nn.Dropout(0.1),\n            nn.Conv1d(64, 2, 1))\n\n    def forward(self, x, A):\n        x = self.bn0(x)\n\n        # # rnn block\n        yl = x.permute(2, 0, 1)\n        yl, _ = self.rnn(yl)\n        yl = yl.permute(1, 2, 0)\n\n        # # gcn block\n        yg = x.permute(0, 2, 1)\n        b, n, c = yg.shape\n        A = A.expand(b, n, n)\n        yg = self.gconv1(yg, A)\n        yg = self.gconv2(yg, A)\n        yg = self.gconv3(yg, A)\n        yg = self.gconv4(yg, A)\n        yg = yg.permute(0, 2, 1)\n\n        # res block\n        x = torch.cat([yl, yg, self.conv1(x)], dim=1)\n        pred = self.prediction(x)\n\n        return pred\n"
  },
  {
    "path": "network/layers/CircConv.py",
    "content": "import torch.nn as nn\nimport torch\n\n\nclass CircConv(nn.Module):\n    def __init__(self, state_dim, out_state_dim=None, n_adj=4):\n        super(CircConv, self).__init__()\n\n        self.n_adj = n_adj\n        out_state_dim = state_dim if out_state_dim is None else out_state_dim\n        self.fc = nn.Conv1d(state_dim, out_state_dim, kernel_size=self.n_adj*2+1)\n\n    def forward(self, input, adj):\n        input = torch.cat([input[..., -self.n_adj:], input, input[..., :self.n_adj]], dim=2)\n        return self.fc(input)\n\n\nclass DilatedCircConv(nn.Module):\n    def __init__(self, state_dim, out_state_dim=None, n_adj=4, dilation=1):\n        super(DilatedCircConv, self).__init__()\n\n        self.n_adj = n_adj\n        self.dilation = dilation\n        out_state_dim = state_dim if out_state_dim is None else out_state_dim\n        self.fc = nn.Conv1d(state_dim, out_state_dim, kernel_size=self.n_adj*2+1, dilation=self.dilation)\n\n    def forward(self, input, adj):\n        if self.n_adj != 0:\n            input = torch.cat([input[..., -self.n_adj*self.dilation:], input, input[..., :self.n_adj*self.dilation]], dim=2)\n        return self.fc(input)\n\n\n_conv_factory = {\n    'grid': CircConv,\n    'dgrid': DilatedCircConv\n}\n\n\nclass BasicBlock(nn.Module):\n    def __init__(self, state_dim, out_state_dim, conv_type, n_adj=4, dilation=1):\n        super(BasicBlock, self).__init__()\n\n        self.conv = _conv_factory[conv_type](state_dim, out_state_dim, n_adj, dilation)\n        self.relu = nn.ReLU(inplace=True)\n        self.norm = nn.BatchNorm1d(out_state_dim)\n\n    def forward(self, x, adj=None):\n        x = self.conv(x, adj)\n        x = self.relu(x)\n        x = self.norm(x)\n        return x\n\n\nclass DeepSnake(nn.Module):\n    def __init__(self, state_dim, feature_dim, conv_type='dgrid'):\n        super(DeepSnake, self).__init__()\n\n        self.head = BasicBlock(feature_dim, state_dim, conv_type)\n\n        self.res_layer_num = 7\n  
      dilation = [1, 1, 1, 2, 2, 4, 4]\n        for i in range(self.res_layer_num):\n            conv = BasicBlock(state_dim, state_dim, conv_type, n_adj=4, dilation=dilation[i])\n            self.__setattr__('res'+str(i), conv)\n\n        fusion_state_dim = 256\n        self.fusion = nn.Conv1d(state_dim * (self.res_layer_num + 1), fusion_state_dim, 1)\n        self.prediction = nn.Sequential(\n            nn.Conv1d(state_dim * (self.res_layer_num + 1) + fusion_state_dim, 256, 1),\n            nn.ReLU(inplace=True),\n            nn.Conv1d(256, 64, 1),\n            nn.ReLU(inplace=True),\n            nn.Conv1d(64, 2, 1)\n        )\n\n    def forward(self, x, adj):\n        states = []\n\n        x = self.head(x, adj)\n        states.append(x)\n        for i in range(self.res_layer_num):\n            x = self.__getattr__('res'+str(i))(x, adj) + x\n            states.append(x)\n\n        state = torch.cat(states, dim=1)\n        global_state = torch.max(self.fusion(state), dim=2, keepdim=True)[0]\n        global_state = global_state.expand(global_state.size(0), global_state.size(1), state.size(2))\n        state = torch.cat([global_state, state], dim=1)\n        x = self.prediction(state)\n\n        return x\n"
  },
  {
    "path": "network/layers/GCN.py",
    "content": "###################################################################\n# File Name: GCN.py\n# Author: S.X.Zhang\n###################################################################\n\nfrom __future__ import print_function\nfrom __future__ import division\nfrom __future__ import absolute_import\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom torch.nn import init\n\n\nclass MeanAggregator(nn.Module):\n    def __init__(self):\n        super(MeanAggregator, self).__init__()\n\n    def forward(self, features, A):\n        x = torch.bmm(A, features)\n        return x\n\n\nclass GraphConv(nn.Module):\n    def __init__(self, in_dim, out_dim, agg):\n        super(GraphConv, self).__init__()\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        self.weight = nn.Parameter(torch.FloatTensor(in_dim * 2, out_dim))\n        self.bias = nn.Parameter(torch.FloatTensor(out_dim))\n        init.xavier_uniform_(self.weight)\n        init.constant_(self.bias, 0)\n        self.agg = agg()\n\n    def forward(self, features, A):\n        b, n, d = features.shape\n        assert (d == self.in_dim)\n        agg_feats = self.agg(features, A)\n        cat_feats = torch.cat([features, agg_feats], dim=2)\n        out = torch.einsum('bnd,df->bnf', (cat_feats, self.weight))\n        out = F.relu(out + self.bias)\n        return out\n\n\nclass GCN(nn.Module):\n    def __init__(self, in_dim, out_dim):\n        super(GCN, self).__init__()\n        self.bn0 = nn.BatchNorm1d(in_dim, affine=False)\n\n        self.conv1 = GraphConv(in_dim, 256, MeanAggregator)\n        self.conv2 = GraphConv(256, 1024, MeanAggregator)\n        self.conv3 = GraphConv(1024, 512, MeanAggregator)\n        self.conv4 = GraphConv(512, out_dim, MeanAggregator)\n\n        self.prediction = nn.Sequential(\n            nn.Conv1d(out_dim, 128, 1),\n            nn.ReLU(inplace=True),\n            nn.Conv1d(128, 64, 1),\n            nn.ReLU(inplace=True),\n            
nn.Conv1d(64, 2, 1))\n\n    def forward(self, x, A):\n        x = self.bn0(x)\n        x = x.permute(0, 2, 1)\n        b, n, c = x.shape\n        A = A.expand(b, n, n)\n\n        x = self.conv1(x, A)\n        x = self.conv2(x, A)\n        x = self.conv3(x, A)\n        x = self.conv4(x, A)\n\n        x = x.permute(0, 2, 1)\n        pred = self.prediction(x)\n\n        return pred\n"
  },
  {
    "path": "network/layers/GraphConv.py",
    "content": "import math\n\nimport torch\nfrom torch.nn.parameter import Parameter\nfrom torch.nn.modules.module import Module\nfrom torch.nn import init\n\n\nclass GraphConvolution(Module):\n    \"\"\"\n    Simple GCN layer, similar to https://arxiv.org/abs/1609.02907\n    \"\"\"\n\n    def __init__(self, in_features, out_features, bias=True):\n        super(GraphConvolution, self).__init__()\n        self.in_features = in_features\n        self.out_features = out_features\n        self.weight = Parameter(torch.FloatTensor(in_features, out_features))\n        init.xavier_uniform_(self.weight)\n        if bias:\n            self.bias = Parameter(torch.FloatTensor(out_features))\n            init.constant_(self.bias, 0)\n        else:\n            self.register_parameter('bias', None)\n\n        self.reset_parameters()\n\n    def reset_parameters(self):\n        stdv = 1. / math.sqrt(self.weight.size(1))\n        self.weight.data.uniform_(-stdv, stdv)\n        if self.bias is not None:\n            self.bias.data.uniform_(-stdv, stdv)\n\n    def forward(self, input, adj):\n        support = torch.mm(input, self.weight)\n        output = torch.spmm(adj, support)\n        if self.bias is not None:\n            return output + self.bias\n        else:\n            return output\n\n    def __repr__(self):\n        return self.__class__.__name__ + ' (' \\\n               + str(self.in_features) + ' -> ' \\\n               + str(self.out_features) + ')'\n"
  },
  {
    "path": "network/layers/RNN.py",
    "content": "###################################################################\n# File Name: RNN.py\n# Author: S.X.Zhang\n###################################################################\n\nfrom __future__ import print_function\nfrom __future__ import division\nfrom __future__ import absolute_import\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom torch.nn import init\n\n\nclass RNN(nn.Module):\n    def __init__(self, input, state_dim):\n        super(RNN, self).__init__()\n        self.bn0 = nn.BatchNorm1d(input, affine=False)\n        self.rnn = nn.LSTM(input, state_dim, 1, dropout=0.1, bidirectional=True)\n        self.prediction = nn.Sequential(\n            nn.Conv1d(state_dim*2, 128, 1),\n            nn.ReLU(inplace=True),\n            nn.Conv1d(128, 64, 1),\n            nn.ReLU(inplace=True),\n            nn.Conv1d(64, 2, 1))\n\n    def forward(self, x, adj):\n        x = self.bn0(x)\n        x = x.permute(2, 0, 1)\n        x, _ = self.rnn(x)\n        x = x.permute(1, 2, 0)\n        pred = self.prediction(x)\n\n        return pred\n"
  },
  {
    "path": "network/layers/Transformer.py",
    "content": "###################################################################\n# File Name: GCN.py\n# Author: S.X.Zhang\n###################################################################\nimport torch\nfrom torch import nn, Tensor\nimport numpy as np\nfrom cfglib.config import config as cfg\n\n\nclass Positional_encoding(nn.Module):\n    def __init__(self, PE_size, n_position=256):\n        super(Positional_encoding, self).__init__()\n        self.PE_size = PE_size\n        self.n_position = n_position\n        self.register_buffer('pos_table', self.get_encoding_table(n_position, PE_size))\n\n    def get_encoding_table(self, n_position, PE_size):\n        position_table = np.array(\n            [[pos / np.power(10000, 2. * i / self.PE_size) for i in range(self.PE_size)] for pos in range(n_position)])\n        position_table[:, 0::2] = np.sin(position_table[:, 0::2])\n        position_table[:, 1::2] = np.cos(position_table[:, 1::2])\n        return torch.FloatTensor(position_table).unsqueeze(0)\n\n    def forward(self, inputs):\n        return inputs + self.pos_table[:, :inputs.size(1), :].clone().detach()\n\n\nclass MultiHeadAttention(nn.Module):\n    def __init__(self, num_heads, embed_dim, dropout=0.1, if_resi=True):\n        super(MultiHeadAttention, self).__init__()\n        self.layer_norm = nn.LayerNorm(embed_dim)\n        self.MultiheadAttention = nn.MultiheadAttention(embed_dim, num_heads, dropout=dropout)\n        self.Q_proj = nn.Sequential(nn.Linear(embed_dim, embed_dim), nn.ReLU())\n        self.K_proj = nn.Sequential(nn.Linear(embed_dim, embed_dim), nn.ReLU())\n        self.V_proj = nn.Sequential(nn.Linear(embed_dim, embed_dim), nn.ReLU())\n        self.if_resi = if_resi\n\n    def forward(self, inputs):\n        query = self.layer_norm(inputs)\n        q = self.Q_proj(query)\n        k = self.K_proj(query)\n        v = self.V_proj(query)\n        attn_output, attn_output_weights = self.MultiheadAttention(q, k, v)\n        if self.if_resi:\n    
        attn_output += inputs\n        else:\n            attn_output = attn_output\n\n        return attn_output\n\n\nclass FeedForward(nn.Module):\n    def __init__(self, in_channel, FFN_channel, if_resi=True):\n        super(FeedForward, self).__init__()\n        \"\"\"\n        1024 2048\n        \"\"\"\n        output_channel = (FFN_channel, in_channel)\n        self.fc1 = nn.Sequential(nn.Linear(in_channel, output_channel[0]), nn.ReLU())\n        self.fc2 = nn.Linear(output_channel[0], output_channel[1])\n        self.layer_norm = nn.LayerNorm(in_channel)\n        self.if_resi = if_resi\n\n    def forward(self, inputs):\n        outputs = self.layer_norm(inputs)\n        outputs = self.fc1(outputs)\n        outputs = self.fc2(outputs)\n        if self.if_resi:\n            outputs += inputs\n        else:\n            outputs = outputs\n        return outputs\n\n\nclass TransformerLayer(nn.Module):\n    def __init__(self, out_dim, in_dim, num_heads, attention_size,\n                 dim_feedforward=1024, drop_rate=0.1, if_resi=True, block_nums=3):\n        super(TransformerLayer, self).__init__()\n        self.block_nums = block_nums\n        self.if_resi = if_resi\n        self.linear = nn.Linear(in_dim, attention_size)\n        for i in range(self.block_nums):\n            self.__setattr__('MHA_self_%d' % i, MultiHeadAttention(num_heads, attention_size,\n                                                                   dropout=drop_rate, if_resi=if_resi))\n            self.__setattr__('FFN_%d' % i, FeedForward(out_dim, dim_feedforward, if_resi=if_resi))\n\n    def forward(self, query):\n        inputs = self.linear(query)\n        # outputs = inputs\n        for i in range(self.block_nums):\n            outputs = self.__getattr__('MHA_self_%d' % i)(inputs)\n            outputs = self.__getattr__('FFN_%d' % i)(outputs)\n            if self.if_resi:\n                inputs = inputs+outputs\n            else:\n                inputs = outputs\n        # 
outputs = inputs\n        return inputs\n\n\nclass Transformer(nn.Module):\n\n    def __init__(self, in_dim, out_dim, num_heads=8,\n                 dim_feedforward=1024, drop_rate=0.1, if_resi=False, block_nums=3):\n        super().__init__()\n\n        self.bn0 = nn.BatchNorm1d(in_dim, affine=False)\n        self.conv1 = nn.Conv1d(in_dim, out_dim, 1, dilation=1)\n\n        # self.pos_embedding = Positional_encoding(in_dim)\n        self.transformer = TransformerLayer(out_dim, in_dim, num_heads, attention_size=out_dim,\n                                            dim_feedforward=dim_feedforward, drop_rate=drop_rate,\n                                            if_resi=if_resi, block_nums=block_nums)\n\n        self.prediction = nn.Sequential(\n            nn.Conv1d(2*out_dim, 128, 1),\n            nn.ReLU(inplace=True),\n            nn.Dropout(0.1),\n            nn.Conv1d(128, 64, 1),\n            nn.ReLU(inplace=True),\n            # nn.Dropout(0.1),\n            nn.Conv1d(64, 2, 1))\n\n    def forward(self, x, adj):\n        x = self.bn0(x)\n\n        x1 = x.permute(0, 2, 1)\n        # x1 = self.pos_embedding(x1)\n        x1 = self.transformer(x1)\n        x1 = x1.permute(0, 2, 1)\n\n        x = torch.cat([x1, self.conv1(x)], dim=1)\n        # x = x1+self.conv1(x)\n        pred = self.prediction(x)\n\n        return pred\n\n\n\n"
  },
  {
    "path": "network/layers/Transformer_old.py",
    "content": "###################################################################\n# File Name: GCN.py\n# Author: S.X.Zhang\n###################################################################\n\nimport torch\nimport torch.nn.functional as F\nfrom torch import nn, Tensor\nfrom torch.autograd import Variable\nimport numpy as np\nfrom cfglib.config import config as cfg\n\n\nclass Positional_encoding(nn.Module):\n    def __init__(self, PE_size, n_position=200):\n        super(Positional_encoding, self).__init__()\n        self.PE_size = PE_size\n        self.n_position = n_position\n        self.register_buffer('pos_table', self.get_encoding_table(n_position, PE_size))\n\n    def get_encoding_table(self, n_position, PE_size):\n        position_table = np.array(\n            [[pos / np.power(10000, 2. * i / self.PE_size) for i in range(self.PE_size)] for pos in range(n_position)])\n        position_table[:, 0::2] = np.sin(position_table[:, 0::2])\n        position_table[:, 1::2] = np.cos(position_table[:, 1::2])\n        return torch.FloatTensor(position_table).unsqueeze(0)\n\n    def forward(self, inputs):\n        return inputs + self.pos_table[:, :inputs.size(1), :].clone().detach()\n\n\nclass MultiHeadAttention(nn.Module):\n    def __init__(self, num_heads, embedding_size, attention_size,\n                 drop_rate, future_blind=True, query_mask=False, if_resi=True):\n        super(MultiHeadAttention, self).__init__()\n        self.num_heads = num_heads\n        self.embedding_size = embedding_size\n        self.attention_size = attention_size\n        self.drop_rate = drop_rate\n        self.future_blind = future_blind\n        \n        self.Q_proj = nn.Sequential(nn.Linear(self.embedding_size, self.attention_size), nn.ReLU())\n        self.K_proj = nn.Sequential(nn.Linear(self.embedding_size, self.attention_size), nn.ReLU())\n        self.V_proj = nn.Sequential(nn.Linear(self.embedding_size, self.attention_size), nn.ReLU())\n\n        self.drop_out = 
nn.Dropout(p=self.drop_rate)\n        self.layer_norm = nn.LayerNorm(self.attention_size)\n        self.if_resi = if_resi\n\n    def forward(self, query, key, value):\n        q = self.Q_proj(query)\n        k = self.K_proj(key)\n        v = self.V_proj(value)\n\n        q_ = torch.cat(torch.chunk(q, self.num_heads, dim=2), dim=0)\n        k_ = torch.cat(torch.chunk(k, self.num_heads, dim=2), dim=0)\n        v_ = torch.cat(torch.chunk(v, self.num_heads, dim=2), dim=0)\n\n        outputs = torch.bmm(q_, k_.permute(0, 2, 1))\n        outputs = outputs / (k_.size()[-1] ** 0.5)\n\n        # key mask\n\n        # future mask\n        if self.future_blind:\n            diag_vals = torch.ones_like(outputs[0, :, :]).to(cfg.device)\n            tril = torch.tril(diag_vals, diagonal=0)\n            masks = Variable(torch.unsqueeze(tril, 0).repeat(outputs.size()[0], 1, 1))  # (h*N,T_q,T_k)\n            padding = Variable(torch.ones_like(masks).to(cfg.device) * (-2 ** 32 + 1))\n            condition = masks.eq(0)\n            outputs = torch.where(condition, padding, outputs)\n\n        outputs = F.softmax(outputs, dim=-1)\n        # if self.future_blind==True:a\n        #     print(outputs[0])\n        outputs = self.drop_out(outputs)\n\n        outputs = torch.bmm(outputs, v_)\n        outputs = torch.cat(torch.chunk(outputs, self.num_heads, dim=0), dim=2)  # N,T_q,C\n\n        if self.if_resi:\n            # outputs += query\n            outputs += q\n        else:\n            outputs = outputs\n        outputs = self.layer_norm(outputs)\n\n        return outputs\n\n\nclass FeedForward(nn.Module):\n    def __init__(self, in_channel, FFN_channel, if_resi=True):\n        super(FeedForward, self).__init__()\n        \"\"\"\n        1024 2048\n        \"\"\"\n        output_channel = (FFN_channel, in_channel)\n        self.fc1 = nn.Sequential(nn.Linear(in_channel, output_channel[0]), nn.ReLU())\n        self.fc2 = nn.Linear(output_channel[0], output_channel[1])\n        
self.layer_norm = nn.LayerNorm(in_channel)\n        self.if_resi = if_resi\n\n    def forward(self, inputs):\n        outputs = self.fc1(inputs)\n        outputs = self.fc2(outputs)\n        if self.if_resi:\n            outputs += inputs\n        else:\n            outputs = outputs\n        outputs = self.layer_norm(outputs)\n        return outputs\n\n\nclass TransformerLayer(nn.Module):\n    def __init__(self, out_dim, num_heads, embedding_size, attention_size,\n                 dim_feedforward=1024, drop_rate=0.1, if_resi=True, block_nums=3):\n        super(TransformerLayer, self).__init__()\n        self.block_nums = block_nums\n        self.if_resi = if_resi\n        for i in range(self.block_nums):\n            self.__setattr__('MHA_self_%d' % i, MultiHeadAttention(num_heads, embedding_size, attention_size,\n                                                                   drop_rate, future_blind=False, if_resi=if_resi))\n            self.__setattr__('FFN_%d' % i, FeedForward(out_dim, dim_feedforward, if_resi=if_resi))\n\n    def forward(self, query):\n        outputs = None\n        for i in range(self.block_nums):\n            outputs = self.__getattr__('MHA_self_%d' % i)(query, query, query)\n            outputs = self.__getattr__('FFN_%d' % i)(outputs)\n        return outputs\n\n\nclass Transformer(nn.Module):\n\n    def __init__(self, in_dim, out_dim, num_heads=8,\n                 dim_feedforward=1024, drop_rate=0.1, if_resi=False, block_nums=3):\n        super().__init__()\n\n        self.bn0 = nn.BatchNorm1d(in_dim, affine=False)\n        self.conv1 = nn.Conv1d(in_dim, out_dim, 1, dilation=1)\n\n        embed_dim = in_dim\n        # self.pos_embedding = Positional_encoding(embed_dim)\n        self.transformer = TransformerLayer(out_dim, num_heads, embedding_size=embed_dim,\n                                            attention_size=out_dim, dim_feedforward=dim_feedforward,\n                                            drop_rate=drop_rate, 
if_resi=if_resi, block_nums=block_nums)\n\n        self.prediction = nn.Sequential(\n            nn.Conv1d(out_dim*2, 128, 1),\n            nn.ReLU(inplace=True),\n            nn.Dropout(0.1),\n            nn.Conv1d(128, 64, 1),\n            nn.ReLU(inplace=True),\n            # nn.Dropout(0.1),\n            nn.Conv1d(64, 2, 1))\n\n    def forward(self, x, adj):\n        x = self.bn0(x)\n\n        x1 = x.permute(0, 2, 1)\n        x1 = self.transformer(x1)\n        x1 = x1.permute(0, 2, 1)\n\n        x = torch.cat([x1, self.conv1(x)], dim=1)\n        # x = x1+self.conv1(x)\n        pred = self.prediction(x)\n\n        return pred\n\n\n\n"
  },
  {
    "path": "network/layers/__init__.py",
    "content": ""
  },
  {
    "path": "network/layers/gcn_utils.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport torch\nimport numpy as np\nimport cv2\nimport torch.nn as nn\nfrom torch.autograd import Variable\n\n\ndef normalize_adj(A, type=\"AD\"):\n    if type == \"DAD\":\n        A = A + np.eye(A.shape[0])  # A=A+I\n        d = np.sum(A, axis=0)\n        d_inv = np.power(d, -0.5).flatten()\n        d_inv[np.isinf(d_inv)] = 0.0\n        d_inv = np.diag(d_inv)\n        G = A.dot(d_inv).transpose().dot(d_inv)  # L = D^-1/2 A D^-1/2\n        G = torch.from_numpy(G)\n    elif type == \"AD\":\n        A = A + np.eye(A.shape[0])  # A=A+I\n        A = torch.from_numpy(A)\n        D = A.sum(1, keepdim=True)\n        G = A.div(D)  # L= A/D\n    else:\n        A = A + np.eye(A.shape[0])  # A=A+I\n        D = A.sum(1, keepdim=True)\n        D = np.diag(D)\n        G = torch.from_numpy(D - A)  # L = D-A\n    return G\n\n\ndef np_to_variable(x, is_cuda=True, dtype=torch.FloatTensor):\n    v = Variable(torch.from_numpy(x).type(dtype))\n    if is_cuda:\n        v = v.cuda()\n    return v\n\n\ndef set_trainable(model, requires_grad):\n    for param in model.parameters():\n        param.requires_grad = requires_grad\n\n\ndef weights_normal_init(model, dev=0.01):\n    if isinstance(model, list):\n        for m in model:\n            weights_normal_init(m, dev)\n    else:\n        for m in model.modules():\n            if isinstance(m, nn.Conv2d):\n                m.weight.data.normal_(0.0, dev)\n            elif isinstance(m, nn.Linear):\n                m.weight.data.normal_(0.0, dev)\n\n\ndef clip_gradient(model, clip_norm):\n    \"\"\"Computes a gradient clipping coefficient based on gradient norm.\"\"\"\n    totalnorm = 0\n    for p in model.parameters():\n        if p.requires_grad:\n            modulenorm = p.grad.data.norm()\n            totalnorm += modulenorm ** 2\n    totalnorm = np.sqrt(totalnorm)\n\n    norm = clip_norm / max(totalnorm, clip_norm)\n    for p in model.parameters():\n        if 
p.requires_grad:\n            p.grad.mul_(norm)\n\n\ndef EuclideanDistances(A, B):\n    BT = B.transpose()\n    vecProd = np.dot(A,BT)\n    SqA = A**2\n    sumSqA = np.matrix(np.sum(SqA, axis=1))\n    sumSqAEx = np.tile(sumSqA.transpose(), (1, vecProd.shape[1]))\n\n    SqB = B**2\n    sumSqB = np.sum(SqB, axis=1)\n    sumSqBEx = np.tile(sumSqB, (vecProd.shape[0], 1))\n    SqED = sumSqBEx + sumSqAEx - 2*vecProd\n    SqED[SqED<0]=0.0\n    ED = np.sqrt(SqED)\n    return ED\n\n\ndef get_center_feature(cnn_feature, img_poly, ind, h, w):\n    batch_size = cnn_feature.size(0)\n    for i in range(batch_size):\n        poly = img_poly[ind == i].cpu().numpy()\n        mask = np.zeros((h, w), dtype=np.uint8)\n        cv2.fillPoly(mask, poly.astype(np.int32), color=(1,))\n    return None\n\n\ndef get_node_feature(cnn_feature, img_poly, ind, h, w):\n    img_poly = img_poly.clone().float()\n    img_poly[..., 0] = img_poly[..., 0] / (w / 2.) - 1\n    img_poly[..., 1] = img_poly[..., 1] / (h / 2.) - 1\n\n    batch_size = cnn_feature.size(0)\n    gcn_feature = torch.zeros([img_poly.size(0), cnn_feature.size(1), img_poly.size(1)]).to(img_poly.device)\n    for i in range(batch_size):\n        poly = img_poly[ind == i].unsqueeze(0)\n        gcn_feature[ind == i] = torch.nn.functional.grid_sample(cnn_feature[i:i + 1], poly)[0].permute(1, 0, 2)\n    return gcn_feature\n\n\ndef get_adj_mat(n_adj, n_nodes):\n    a = np.zeros([n_nodes, n_nodes], dtype=np.float)\n\n    for i in range(n_nodes):\n        for j in range(-n_adj // 2, n_adj // 2 + 1):\n            if j != 0:\n                a[i][(i + j) % n_nodes] = 1\n                a[(i + j) % n_nodes][i] = 1\n    return a\n\n\ndef get_adj_ind(n_adj, n_nodes, device):\n    ind = torch.tensor([i for i in range(-n_adj // 2, n_adj // 2 + 1) if i != 0]).long()\n    ind = (torch.arange(n_nodes)[:, None] + ind[None]) % n_nodes\n    return ind.to(device)\n\n\ndef coord_embedding(b, w, h, device):\n    x_range = torch.linspace(0, 1, w, 
device=device)\n    y_range = torch.linspace(0, 1, h, device=device)\n    y, x = torch.meshgrid(y_range, x_range)\n    y = y.expand([b, 1, -1, -1])\n    x = x.expand([b, 1, -1, -1])\n    coord_map = torch.cat([x, y], 1)\n\n    return coord_map\n\n\ndef img_poly_to_can_poly(img_poly):\n    if len(img_poly) == 0:\n        return torch.zeros_like(img_poly)\n    x_min = torch.min(img_poly[..., 0], dim=-1)[0]\n    y_min = torch.min(img_poly[..., 1], dim=-1)[0]\n    can_poly = img_poly.clone()\n    can_poly[..., 0] = can_poly[..., 0] - x_min[..., None]\n    can_poly[..., 1] = can_poly[..., 1] - y_min[..., None]\n    # x_max = torch.max(img_poly[..., 0], dim=-1)[0]\n    # y_max = torch.max(img_poly[..., 1], dim=-1)[0]\n    # h, w = y_max - y_min + 1, x_max - x_min + 1\n    # long_side = torch.max(h, w)\n    # can_poly = can_poly / long_side[..., None, None]\n    return can_poly\n"
  },
  {
    "path": "network/layers/model_block.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom network.layers.vgg import VggNet\nfrom network.layers.resnet import ResNet\nfrom network.layers.resnet_dcn import ResNet_DCN\nfrom cfglib.config import config as cfg\n\n\nclass UpBlok(nn.Module):\n\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n        self.conv1x1 = nn.Conv2d(in_channels, out_channels, kernel_size=1, stride=1, padding=0)\n        self.conv3x3 = nn.Conv2d(out_channels, out_channels, kernel_size=3, stride=1, padding=1)\n        self.deconv = nn.ConvTranspose2d(out_channels, out_channels, kernel_size=4, stride=2, padding=1)\n\n    def forward(self, upsampled, shortcut):\n        # NOTE:\n        #   Several legacy checkpoints (e.g. deformable_resnet50 TD500 @ epoch 300)\n        #   were trained when UpBlok contained an internal ConvTranspose upsample.\n        #   After refactoring the FPN we use MergeBlok (no upsample) for scale=4.\n        #   When those old weights are loaded the learnt deconv filters are skipped,\n        #   leaving feature maps at half resolution.  
The explicit resize keeps\n        #   the runtime compatible across both families of checkpoints without\n        #   affecting the newer models (resnet18/50 scale=4) whose tensors already\n        #   match in spatial dims.\n        if upsampled.shape[-2:] != shortcut.shape[-2:]:\n            upsampled = F.interpolate(upsampled, size=shortcut.shape[-2:], mode='bilinear', align_corners=False)\n        x = torch.cat([upsampled, shortcut], dim=1)\n        x = self.conv1x1(x)\n        x = F.relu(x)\n        x = self.conv3x3(x)\n        x = F.relu(x)\n        x = self.deconv(x)\n        return x\n\n\nclass MergeBlok(nn.Module):\n    def __init__(self, in_channels, out_channels):\n        super().__init__()\n        self.conv1x1 = nn.Conv2d(in_channels, out_channels, kernel_size=1, stride=1, padding=0)\n        self.conv3x3 = nn.Conv2d(out_channels, out_channels, kernel_size=3, stride=1, padding=1)\n\n    def forward(self, upsampled, shortcut):\n        # See note in UpBlok.forward: we guard the concat by resizing when old\n        # checkpoints (with learned deconv upsample) are loaded on the refactored\n        # FPN. The check is a no-op for the current layout where shapes already\n        # align (e.g. 
resnet18 scale=4 and matching weights).\n        if upsampled.shape[-2:] != shortcut.shape[-2:]:\n            upsampled = F.interpolate(upsampled, size=shortcut.shape[-2:], mode='bilinear', align_corners=False)\n        x = torch.cat([upsampled, shortcut], dim=1)\n        x = self.conv1x1(x)\n        x = F.relu(x)\n        x = self.conv3x3(x)\n        return x\n\n\nclass FPN(nn.Module):\n\n    def __init__(self, backbone='resnet50', is_training=True):\n        super().__init__()\n        self.is_training = is_training\n        self.backbone_name = backbone\n\n        if backbone in ['vgg_bn', 'vgg']:\n            self.backbone = VggNet(name=backbone, pretrain=is_training)\n            self.deconv5 = nn.ConvTranspose2d(512, 256, kernel_size=4, stride=2, padding=1)\n            self.merge4 = UpBlok(512 + 256, 128)\n            self.merge3 = UpBlok(256 + 128, 64)\n            if cfg.scale == 1:\n                self.merge2 = UpBlok(128 + 64, 32)  # FPN 1/2\n                self.merge1 = UpBlok(64 + 32, 32)   # FPN 1/1\n            elif cfg.scale == 2:\n                self.merge2 = UpBlok(128 + 64, 32)    # FPN 1/2\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/2\n            elif cfg.scale == 4:\n                self.merge2 = MergeBlok(128 + 64, 32)  # FPN 1/4\n\n        elif backbone in ['resnet50']:\n            self.backbone = ResNet(name=backbone, pretrain=is_training)\n            self.deconv5 = nn.ConvTranspose2d(2048, 256, kernel_size=4, stride=2, padding=1)\n            self.merge4 = UpBlok(1024 + 256, 128)\n            self.merge3 = UpBlok(512 + 128, 64)\n            if cfg.scale == 1:\n                self.merge2 = UpBlok(256 + 64, 32)  # FPN 1/2\n                self.merge1 = UpBlok(64 + 32, 32)   # FPN 1/1\n            elif cfg.scale == 2:\n                self.merge2 = UpBlok(256 + 64, 32)    # FPN 1/2\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/2\n            elif cfg.scale == 4:\n                self.merge2 = 
MergeBlok(256 + 64, 32)  # FPN 1/4\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/4\n        \n        elif backbone in ['resnet18']:\n            self.backbone = ResNet(name=backbone, pretrain=is_training)\n            self.deconv5 = nn.ConvTranspose2d(512, 256, kernel_size=4, stride=2, padding=1)\n            self.merge4 = UpBlok(256 + 256, 128)\n            self.merge3 = UpBlok(128 + 128, 64)\n            if cfg.scale == 1:\n                self.merge2 = UpBlok(64 + 64, 32)  # FPN 1/2\n                self.merge1 = UpBlok(64 + 32, 32)   # FPN 1/1\n            elif cfg.scale == 2:\n                self.merge2 = UpBlok(64 + 64, 32)    # FPN 1/2\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/2\n            elif cfg.scale == 4:\n                self.merge2 = MergeBlok(64 + 64, 32)  # FPN 1/4\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/4\n       \n        elif backbone in [\"deformable_resnet18\"]:\n            self.backbone = ResNet_DCN(name=backbone, pretrain=is_training)\n            self.deconv5 = nn.ConvTranspose2d(512, 256, kernel_size=4, stride=2, padding=1)\n            self.merge4 = UpBlok(256 + 256, 128)\n            self.merge3 = UpBlok(128 + 128, 64)\n            if cfg.scale == 1:\n                self.merge2 = UpBlok(64 + 64, 32)  # FPN 1/2\n                self.merge1 = UpBlok(64 + 32, 32)   # FPN 1/1\n            elif cfg.scale == 2:\n                self.merge2 = UpBlok(64 + 64, 32)    # FPN 1/2\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/2\n            elif cfg.scale == 4:\n                self.merge2 = MergeBlok(64 + 64, 32)  # FPN 1/4\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/4\n        \n        elif backbone in [\"deformable_resnet50\"]:\n            self.backbone = ResNet_DCN(name=backbone, pretrain=is_training)\n            self.deconv5 = nn.ConvTranspose2d(2048, 256, kernel_size=4, stride=2, padding=1)\n            self.merge4 = UpBlok(1024 + 256, 
128)\n            self.merge3 = UpBlok(512 + 128, 64)\n            if cfg.scale == 1:\n                self.merge2 = UpBlok(256 + 64, 32)  # FPN 1/2\n                self.merge1 = UpBlok(64 + 32, 32)  # FPN 1/1\n            elif cfg.scale == 2:\n                self.merge2 = UpBlok(256 + 64, 32)  # FPN 1/2\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/2\n            elif cfg.scale == 4:\n                self.merge2 = MergeBlok(256 + 64, 32)  # FPN 1/4\n                self.merge1 = MergeBlok(64 + 32, 32)  # FPN 1/4\n        else:\n            print(\"backbone {} is not supported!\".format(backbone))\n\n    def forward(self, x):\n        C1, C2, C3, C4, C5 = self.backbone(x)\n        up5 = self.deconv5(C5)\n        up5 = F.relu(up5)\n\n        up4 = self.merge4(C4, up5)\n        up4 = F.relu(up4)\n\n        up3 = self.merge3(C3, up4)\n        up3 = F.relu(up3)\n\n        up2 = self.merge2(C2, up3)\n        up2 = F.relu(up2)\n\n        up1 = self.merge1(C1, up2)\n        up1 = F.relu(up1)\n\n        return up1, up2, up3, up4, up5\n"
  },
  {
    "path": "network/layers/position_encoding.py",
    "content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n\"\"\"\nVarious positional encodings for the transformer.\n\"\"\"\nimport math\nimport torch\nfrom torch import nn\n\nfrom util.misc import NestedTensor\n\n\nclass PositionEmbeddingSine(nn.Module):\n    \"\"\"\n    This is a more standard version of the position embedding, very similar to the one\n    used by the Attention is all you need paper, generalized to work on images.\n    \"\"\"\n    def __init__(self, num_pos_feats=64, temperature=10000, normalize=False, scale=None):\n        super().__init__()\n        self.num_pos_feats = num_pos_feats\n        self.temperature = temperature\n        self.normalize = normalize\n        if scale is not None and normalize is False:\n            raise ValueError(\"normalize should be True if scale is passed\")\n        if scale is None:\n            scale = 2 * math.pi\n        self.scale = scale\n\n    def forward(self, tensor_list: NestedTensor):\n        x = tensor_list.tensors\n        mask = tensor_list.mask\n        assert mask is not None\n        not_mask = ~mask\n        y_embed = not_mask.cumsum(1, dtype=torch.float32)\n        x_embed = not_mask.cumsum(2, dtype=torch.float32)\n        if self.normalize:\n            eps = 1e-6\n            y_embed = y_embed / (y_embed[:, -1:, :] + eps) * self.scale\n            x_embed = x_embed / (x_embed[:, :, -1:] + eps) * self.scale\n\n        dim_t = torch.arange(self.num_pos_feats, dtype=torch.float32, device=x.device)\n        dim_t = self.temperature ** (2 * (dim_t // 2) / self.num_pos_feats)\n\n        pos_x = x_embed[:, :, :, None] / dim_t\n        pos_y = y_embed[:, :, :, None] / dim_t\n        pos_x = torch.stack((pos_x[:, :, :, 0::2].sin(), pos_x[:, :, :, 1::2].cos()), dim=4).flatten(3)\n        pos_y = torch.stack((pos_y[:, :, :, 0::2].sin(), pos_y[:, :, :, 1::2].cos()), dim=4).flatten(3)\n        pos = torch.cat((pos_y, pos_x), dim=3).permute(0, 3, 1, 2)\n        return 
pos\n\n\nclass PositionEmbeddingLearned(nn.Module):\n    \"\"\"\n    Absolute pos embedding, learned.\n    \"\"\"\n    def __init__(self, num_pos_feats=256):\n        super().__init__()\n        self.row_embed = nn.Embedding(50, num_pos_feats)\n        self.col_embed = nn.Embedding(50, num_pos_feats)\n        self.reset_parameters()\n\n    def reset_parameters(self):\n        nn.init.uniform_(self.row_embed.weight)\n        nn.init.uniform_(self.col_embed.weight)\n\n    def forward(self, tensor_list: NestedTensor):\n        x = tensor_list.tensors\n        h, w = x.shape[-2:]\n        i = torch.arange(w, device=x.device)\n        j = torch.arange(h, device=x.device)\n        x_emb = self.col_embed(i)\n        y_emb = self.row_embed(j)\n        pos = torch.cat([\n            x_emb.unsqueeze(0).repeat(h, 1, 1),\n            y_emb.unsqueeze(1).repeat(1, w, 1),\n        ], dim=-1).permute(2, 0, 1).unsqueeze(0).repeat(x.shape[0], 1, 1, 1)\n        return pos\n\n\ndef build_position_encoding(args):\n    N_steps = args.hidden_dim // 2\n    if args.position_embedding in ('v2', 'sine'):\n        # TODO find a better way of exposing other arguments\n        position_embedding = PositionEmbeddingSine(N_steps, normalize=True)\n    elif args.position_embedding in ('v3', 'learned'):\n        position_embedding = PositionEmbeddingLearned(N_steps)\n    else:\n        raise ValueError(f\"not supported {args.position_embedding}\")\n\n    return position_embedding\n"
  },
  {
    "path": "network/layers/resnet.py",
    "content": "import torch\nimport torch.nn as nn\nfrom torchvision.models import resnet\nimport torch.utils.model_zoo as model_zoo\nfrom cfglib.config import config as cfg\n\nmodel_urls = {\n    'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth',\n    'resnet50': 'https://download.pytorch.org/models/resnet50-19c8e357.pth',\n    'resnet34': 'https://download.pytorch.org/models/resnet34-333f7ec4.pth',\n    'resnet101': 'https://download.pytorch.org/models/resnet101-5d3b4d8f.pth',\n    'resnet152': 'https://download.pytorch.org/models/resnet152-b121ed2d.pth',\n\n}\n\n\nclass ResNet(nn.Module):\n    def __init__(self, name=\"resnet50\", pretrain=True):\n        super().__init__()\n\n        if name == \"resnet50\":\n            base_net = resnet.resnet50(pretrained=False)\n        elif name == \"resnet101\":\n            base_net = resnet.resnet101(pretrained=False)\n        elif name == \"resnet18\":\n            base_net = resnet.resnet18(pretrained=False)\n        elif name == \"resnet34\":\n            base_net = resnet.resnet34(pretrained=False)\n\n        else:\n            print(\" base model is not support !\")\n\n        if pretrain:\n            print(\"load the {} weight from ./cache\".format(name))\n            base_net.load_state_dict(model_zoo.load_url(model_urls[name], model_dir=\"./cache\",\n                                                        map_location=torch.device(cfg.device)), strict=False)\n        # print(base_net)\n        self.stage1 = nn.Sequential(\n            base_net.conv1,\n            base_net.bn1,\n            base_net.relu,\n            base_net.maxpool\n        )\n        self.stage2 = base_net.layer1\n        self.stage3 = base_net.layer2\n        self.stage4 = base_net.layer3\n        self.stage5 = base_net.layer4\n        self.up2 = nn.ConvTranspose2d(64, 64, kernel_size=4, stride=2, padding=1)\n\n    def forward(self, x):\n        C1 = self.stage1(x)\n        C2 = self.stage2(C1)\n        C3 = 
self.stage3(C2)\n        C4 = self.stage4(C3)\n        C5 = self.stage5(C4)\n\n        if cfg.scale == 2 or cfg.scale == 1:\n            # up2 --> 1/2\n            C1 = self.up2(C1)\n\n        return C1, C2, C3, C4, C5\n\n\nif __name__ == '__main__':\n    import torch\n    input = torch.randn((4, 3, 512, 512))\n    net = ResNet()\n    C1, C2, C3, C4, C5 = net(input)\n    print(C1.size())\n    print(C2.size())\n    print(C3.size())\n    print(C4.size())\n    print(C5.size())\n"
  },
  {
    "path": "network/layers/resnet_dcn.py",
    "content": "import torch\nimport torch.nn as nn\nfrom network.backbone.resnet import deformable_resnet18,deformable_resnet50\nimport torch.utils.model_zoo as model_zoo\nfrom cfglib.config import config as cfg\n\nmodel_urls = {\n    'resnet18': 'https://download.pytorch.org/models/resnet18-5c106cde.pth',\n    'resnet50': 'https://download.pytorch.org/models/resnet50-19c8e357.pth',\n    'resnet34': 'https://download.pytorch.org/models/resnet34-333f7ec4.pth',\n    'resnet101': 'https://download.pytorch.org/models/resnet101-5d3b4d8f.pth',\n    'resnet152': 'https://download.pytorch.org/models/resnet152-b121ed2d.pth',\n\n}\n\n\nclass ResNet_DCN(nn.Module):\n    def __init__(self, name=\"deformable_resnet18\", pretrain=False):\n        super().__init__()\n\n        if name == \"deformable_resnet18\":\n            self.base_net = deformable_resnet18(pretrained=False)\n            if pretrain:\n                print(\"load the {} weight from ./cache\".format(name))\n                self.base_net.load_state_dict(\n                    model_zoo.load_url(model_urls[\"resnet18\"], model_dir=\"./cache\",\n                                       map_location=torch.device(cfg.device)), strict=False)\n\n        elif name == \"deformable_resnet50\":\n            self.base_net = deformable_resnet50(pretrained=False)\n            if pretrain:\n                print(\"load the {} weight from ./cache\".format(name))\n                self.base_net.load_state_dict(\n                    model_zoo.load_url(model_urls[\"resnet50\"], model_dir=\"./cache\",\n                                       map_location=torch.device(cfg.device)), strict=False)\n        else:\n            print(\" base model is not support !\")\n\n        # print(base_net)\n        self.up2 = nn.ConvTranspose2d(64, 64, kernel_size=4, stride=2, padding=1)\n\n    def forward(self, x):\n        C1, C2, C3, C4, C5 = self.base_net(x)\n        # up2 --> 1/2\n        C1 = self.up2(C1)\n\n        return C1, C2, C3, C4, 
C5\n\n\nif __name__ == '__main__':\n    import torch\n    input = torch.randn((4, 3, 512, 512))\n    net = ResNet_DCN()\n    C1, C2, C3, C4, C5 = net(input)\n    print(C1.size())\n    print(C2.size())\n    print(C3.size())\n    print(C4.size())\n    print(C5.size())\n"
  },
  {
    "path": "network/layers/vgg.py",
    "content": "import torch.nn as nn\nimport torch.utils.model_zoo as model_zoo\nimport torchvision.models as models\nfrom cfglib.config import config as cfg\n\nmodel_urls = {\n    'vgg11': 'https://download.pytorch.org/models/vgg11-bbd30ac9.pth',\n    'vgg16': 'https://download.pytorch.org/models/vgg16-397923af.pth',\n    'vgg19': 'https://download.pytorch.org/models/vgg19-dcbb9e9d.pth',\n    'vgg11_bn': 'https://download.pytorch.org/models/vgg11_bn-6002323d.pth',\n    'vgg16_bn': 'https://download.pytorch.org/models/vgg16_bn-6c64b313.pth',\n    'vgg19_bn': 'https://download.pytorch.org/models/vgg19_bn-c79401a0.pth',\n}\n\n\nclass VggNet(nn.Module):\n    def __init__(self, name=\"vgg16\", pretrain=True):\n        super().__init__()\n        if name == \"vgg16\":\n            base_net = models.vgg16(pretrained=False)\n        elif name == \"vgg16_bn\":\n            base_net = models.vgg16_bn(pretrained=False)\n        else:\n            print(\" base model is not support !\")\n        if pretrain:\n            print(\"load the {} weight from ./cache\".format(name))\n            base_net.load_state_dict(model_zoo.load_url(model_urls[name],\n                                                        model_dir=\"./cache\",map_location=torch.device(cfg.device)))\n\n        if name == \"vgg16\":\n            self.stage1 = nn.Sequential(*[base_net.features[layer] for layer in range(0, 5)])\n            self.stage2 = nn.Sequential(*[base_net.features[layer] for layer in range(5, 10)])\n            self.stage3 = nn.Sequential(*[base_net.features[layer] for layer in range(10, 17)])\n            self.stage4 = nn.Sequential(*[base_net.features[layer] for layer in range(17, 24)])\n            self.stage5 = nn.Sequential(*[base_net.features[layer] for layer in range(24, 31)])\n        elif name == \"vgg16_bn\":\n            self.stage1 = nn.Sequential(*[base_net.features[layer] for layer in range(0, 7)])\n            self.stage2 = nn.Sequential(*[base_net.features[layer] for 
layer in range(7, 14)])\n            self.stage3 = nn.Sequential(*[base_net.features[layer] for layer in range(14, 24)])\n            self.stage4 = nn.Sequential(*[base_net.features[layer] for layer in range(24, 34)])\n            self.stage5 = nn.Sequential(*[base_net.features[layer] for layer in range(34, 44)])\n\n    def forward(self, x):\n        C1 = self.stage1(x)\n        C2 = self.stage2(C1)\n        C3 = self.stage3(C2)\n        C4 = self.stage4(C3)\n        C5 = self.stage5(C4)\n\n        return C1, C2, C3, C4, C5\n\n\nif __name__ == '__main__':\n    import torch\n    input = torch.randn((4, 3, 512, 512))\n    net = VggNet()\n    C1, C2, C3, C4, C5 = net(input)\n    print(C1.size())\n    print(C2.size())\n    print(C3.size())\n    print(C4.size())\n    print(C5.size())\n"
  },
  {
    "path": "network/loss.py",
    "content": "# -*- coding: utf-8 -*-\n# @Time    : 10/1/21\n# @Author  : GXYM\nimport torch\nimport torch.nn as nn\nfrom cfglib.config import config as cfg\nfrom network.Seg_loss import SegmentLoss\nfrom network.Reg_loss import PolyMatchingLoss\nimport torch.nn.functional as F\n\n\nclass TextLoss(nn.Module):\n\n    def __init__(self):\n        super().__init__()\n        self.MSE_loss = torch.nn.MSELoss(reduce=False, size_average=False)\n        self.BCE_loss = torch.nn.BCELoss(reduce=False, size_average=False)\n        self.PolyMatchingLoss = PolyMatchingLoss(cfg.num_points, cfg.device)\n        self.KL_loss = torch.nn.KLDivLoss(reduce=False, size_average=False)\n\n    @staticmethod\n    def single_image_loss(pre_loss, loss_label):\n        batch_size = pre_loss.shape[0]\n        sum_loss = torch.mean(pre_loss.view(-1)) * 0\n        pre_loss = pre_loss.view(batch_size, -1)\n        loss_label = loss_label.view(batch_size, -1)\n        eps = 0.001\n        for i in range(batch_size):\n            average_number = 0\n            positive_pixel = len(pre_loss[i][(loss_label[i] >= eps)])\n            average_number += positive_pixel\n            if positive_pixel != 0:\n                posi_loss = torch.mean(pre_loss[i][(loss_label[i] >= eps)])\n                sum_loss += posi_loss\n                if len(pre_loss[i][(loss_label[i] < eps)]) < 3 * positive_pixel:\n                    nega_loss = torch.mean(pre_loss[i][(loss_label[i] < eps)])\n                    average_number += len(pre_loss[i][(loss_label[i] < eps)])\n                else:\n                    nega_loss = torch.mean(torch.topk(pre_loss[i][(loss_label[i] < eps)], 3 * positive_pixel)[0])\n                    average_number += 3 * positive_pixel\n                sum_loss += nega_loss\n            else:\n                nega_loss = torch.mean(torch.topk(pre_loss[i], 100)[0])\n                average_number += 100\n                sum_loss += nega_loss\n            # sum_loss += 
loss/average_number\n\n        return sum_loss/batch_size\n\n    def cls_ohem(self, predict, target, train_mask, negative_ratio=3.):\n        pos = (target * train_mask).bool()\n        neg = ((1 - target) * train_mask).bool()\n\n        n_pos = pos.float().sum()\n        if n_pos.item() > 0:\n            loss_pos = self.BCE_loss(predict[pos], target[pos]).sum()\n            loss_neg = self.BCE_loss(predict[neg], target[neg])\n            n_neg = min(int(neg.float().sum().item()), int(negative_ratio * n_pos.float()))\n        else:\n            loss_pos = torch.tensor(0.)\n            loss_neg = self.BCE_loss(predict[neg], target[neg])\n            n_neg = 100\n        loss_neg, _ = torch.topk(loss_neg, n_neg)\n\n        return (loss_pos + loss_neg.sum()) / (n_pos + n_neg).float()\n\n    @staticmethod\n    def loss_calc_flux(pred_flux, gt_flux, weight_matrix, mask, train_mask):\n\n        # norm loss\n        gt_flux = 0.999999 * gt_flux / (gt_flux.norm(p=2, dim=1).unsqueeze(1) + 1e-3)\n        norm_loss = weight_matrix * torch.mean((pred_flux - gt_flux) ** 2, dim=1)*train_mask\n        norm_loss = norm_loss.sum(-1).mean()\n        # norm_loss = norm_loss.sum()\n\n        # angle loss\n        mask = train_mask * mask\n        pred_flux = 0.999999 * pred_flux / (pred_flux.norm(p=2, dim=1).unsqueeze(1) + 1e-3)\n        # angle_loss = weight_matrix * (torch.acos(torch.sum(pred_flux * gt_flux, dim=1))) ** 2\n        # angle_loss = angle_loss.sum(-1).mean()\n        angle_loss = (1 - torch.cosine_similarity(pred_flux, gt_flux, dim=1))\n        angle_loss = angle_loss[mask].mean()\n\n        return norm_loss, angle_loss\n\n    @staticmethod\n    def get_poly_energy(energy_field, img_poly, ind, h, w):\n        img_poly = img_poly.clone().float()\n        img_poly[..., 0] = img_poly[..., 0] / (w / 2.) - 1\n        img_poly[..., 1] = img_poly[..., 1] / (h / 2.) 
- 1\n\n        batch_size = energy_field.size(0)\n        gcn_feature = torch.zeros([img_poly.size(0), energy_field.size(1), img_poly.size(1)]).to(img_poly.device)\n        for i in range(batch_size):\n            poly = img_poly[ind == i].unsqueeze(0)\n            gcn_feature[ind == i] = torch.nn.functional.grid_sample(energy_field[i:i + 1], poly)[0].permute(1, 0, 2)\n        return gcn_feature\n\n    def loss_energy_regularization(self, energy_field, img_poly, inds, h, w):\n        energys = []\n        for i, py in enumerate(img_poly):\n            energy = self.get_poly_energy(energy_field.unsqueeze(1), py, inds, h, w)\n            energys.append(energy.squeeze(1).sum(-1))\n\n        regular_loss = torch.tensor(0.)\n        energy_loss = torch.tensor(0.)\n        for i, e in enumerate(energys[1:]):\n            regular_loss += torch.clamp(e - energys[i], min=0.0).mean()\n            energy_loss += torch.where(e <= 0.01, torch.tensor(0.), e).mean()\n\n        return (energy_loss+regular_loss)/len(energys[1:])\n\n    def forward(self, input_dict, output_dict, eps=None):\n        \"\"\"\n          calculate boundary proposal network loss\n        \"\"\"\n        # tr_mask = tr_mask.permute(0, 3, 1, 2).contiguous()\n\n        fy_preds = output_dict[\"fy_preds\"]\n        py_preds = output_dict[\"py_preds\"]\n        inds = output_dict[\"inds\"]\n\n        train_mask = input_dict['train_mask']\n        tr_mask = input_dict['tr_mask'] > 0\n        distance_field = input_dict['distance_field']\n        direction_field = input_dict['direction_field']\n        weight_matrix = input_dict['weight_matrix']\n        gt_tags = input_dict['gt_points']\n\n        # # scale the prediction map\n        # fy_preds = F.interpolate(fy_preds, scale_factor=cfg.scale, mode='bilinear')\n        \n        if cfg.scale > 1:\n            train_mask = F.interpolate(train_mask.float().unsqueeze(1),\n                                       scale_factor=1/cfg.scale, 
mode='bilinear').squeeze().bool()\n            tr_mask = F.interpolate(tr_mask.float().unsqueeze(1),\n                                    scale_factor=1/cfg.scale, mode='bilinear').squeeze().bool()\n\n            distance_field = F.interpolate(distance_field.unsqueeze(1),\n                                           scale_factor=1/cfg.scale, mode='bilinear').squeeze()\n            direction_field = F.interpolate(direction_field,\n                                            scale_factor=1 / cfg.scale, mode='bilinear')\n            weight_matrix = F.interpolate(weight_matrix.unsqueeze(1),\n                                          scale_factor=1/cfg.scale, mode='bilinear').squeeze()\n\n        # pixel class loss\n        # cls_loss = self.cls_ohem(fy_preds[:, 0, :, :], tr_mask.float(), train_mask)\n        cls_loss = self.BCE_loss(fy_preds[:, 0, :, :],  tr_mask.float())\n        cls_loss = torch.mul(cls_loss, train_mask.float()).mean()\n\n        # distance field loss\n        dis_loss = self.MSE_loss(fy_preds[:, 1, :, :], distance_field)\n        dis_loss = torch.mul(dis_loss, train_mask.float())\n        dis_loss = self.single_image_loss(dis_loss, distance_field)\n\n        # # direction field loss\n        norm_loss, angle_loss = self.loss_calc_flux(fy_preds[:, 2:4, :, :], direction_field, \n                                                           weight_matrix, tr_mask, train_mask)\n\n        # boundary point loss\n        point_loss = self.PolyMatchingLoss(py_preds[1:], gt_tags[inds])\n\n        #  Minimum energy loss regularization\n        h, w = distance_field.size(1) * cfg.scale, distance_field.size(2) * cfg.scale\n        energy_loss = self.loss_energy_regularization(distance_field, py_preds, inds[0], h, w)\n\n        if eps is None:\n            alpha = 1.0; beta = 3.0; theta=0.5; gama = 0.05\n        else:\n            alpha = 1.0; beta = 3.0; theta=0.5;\n            gama = 0.1*torch.sigmoid(torch.tensor((eps - cfg.max_epoch)/cfg.max_epoch))\n        
loss = alpha*cls_loss + beta*dis_loss + theta*(norm_loss + angle_loss) + gama*(point_loss + energy_loss)\n\n        loss_dict = {\n            'total_loss': loss,\n            'cls_loss': alpha*cls_loss,\n            'distance loss': beta*dis_loss,\n            'dir_loss': theta*(norm_loss + angle_loss),\n            'norm_loss': theta*norm_loss,\n            'angle_loss': theta*angle_loss,\n            'point_loss': gama*point_loss,\n            'energy_loss': gama*energy_loss,\n\n        }\n\n        return loss_dict\n\n"
  },
  {
    "path": "network/loss_org.py",
    "content": "# -*- coding: utf-8 -*-\n# @Time    : 10/1/21\n# @Author  : GXYM\nimport torch\nimport torch.nn as nn\nfrom cfglib.config import config as cfg\nfrom network.Seg_loss import SegmentLoss\nfrom network.Reg_loss import PolyMatchingLoss\n\n\nclass TextLoss(nn.Module):\n\n    def __init__(self):\n        super().__init__()\n        self.MSE_loss = torch.nn.MSELoss(reduce=False, size_average=False)\n        self.BCE_loss = torch.nn.BCELoss(reduce=False, size_average=False)\n        self.PolyMatchingLoss = PolyMatchingLoss(cfg.num_points, cfg.device)\n        self.KL_loss = torch.nn.KLDivLoss(reduce=False, size_average=False)\n\n    @staticmethod\n    def single_image_loss(pre_loss, loss_label):\n        batch_size = pre_loss.shape[0]\n        sum_loss = torch.mean(pre_loss.view(-1)) * 0\n        pre_loss = pre_loss.view(batch_size, -1)\n        loss_label = loss_label.view(batch_size, -1)\n        eps = 0.001\n        for i in range(batch_size):\n            average_number = 0\n            positive_pixel = len(pre_loss[i][(loss_label[i] >= eps)])\n            average_number += positive_pixel\n            if positive_pixel != 0:\n                posi_loss = torch.mean(pre_loss[i][(loss_label[i] >= eps)])\n                sum_loss += posi_loss\n                if len(pre_loss[i][(loss_label[i] < eps)]) < 3 * positive_pixel:\n                    nega_loss = torch.mean(pre_loss[i][(loss_label[i] < eps)])\n                    average_number += len(pre_loss[i][(loss_label[i] < eps)])\n                else:\n                    nega_loss = torch.mean(torch.topk(pre_loss[i][(loss_label[i] < eps)], 3 * positive_pixel)[0])\n                    average_number += 3 * positive_pixel\n                sum_loss += nega_loss\n            else:\n                nega_loss = torch.mean(torch.topk(pre_loss[i], 100)[0])\n                average_number += 100\n                sum_loss += nega_loss\n            # sum_loss += loss/average_number\n\n        return 
sum_loss/batch_size\n\n    def cls_ohem(self, predict, target, train_mask, negative_ratio=3.):\n        pos = (target * train_mask).bool()\n        neg = ((1 - target) * train_mask).bool()\n\n        n_pos = pos.float().sum()\n\n        if n_pos.item() > 0:\n            loss_pos = self.BCE_loss(predict[pos], target[pos]).sum()\n            loss_neg = self.BCE_loss(predict[neg], target[neg])\n            n_neg = min(int(neg.float().sum().item()), int(negative_ratio * n_pos.float()))\n        else:\n            loss_pos = torch.tensor(0.)\n            loss_neg = self.BCE_loss(predict[neg], target[neg])\n            n_neg = 100\n        loss_neg, _ = torch.topk(loss_neg, n_neg)\n\n        return (loss_pos + loss_neg.sum()) / (n_pos + n_neg).float()\n\n    @staticmethod\n    def loss_calc_flux(pred_flux, gt_flux, weight_matrix, mask, train_mask):\n\n        # norm loss\n        gt_flux = 0.999999 * gt_flux / (gt_flux.norm(p=2, dim=1).unsqueeze(1) + 1e-9)\n        norm_loss = weight_matrix * torch.sum((pred_flux - gt_flux) ** 2, dim=1)*train_mask\n        norm_loss = norm_loss.sum(-1).mean()\n\n        # angle loss\n        mask = train_mask * mask\n        pred_flux = 0.999999 * pred_flux / (pred_flux.norm(p=2, dim=1).unsqueeze(1) + 1e-9)\n        # angle_loss = weight_matrix * (torch.acos(torch.sum(pred_flux * gt_flux, dim=1))) ** 2\n        # angle_loss = angle_loss.sum(-1).mean()\n        angle_loss = (1 - torch.cosine_similarity(pred_flux, gt_flux, dim=1))\n        angle_loss = angle_loss[mask].mean()\n\n        return norm_loss, angle_loss\n\n    def forward(self, input_dict, output_dict, eps=None):\n        \"\"\"\n          calculate boundary proposal network loss\n        \"\"\"\n        # tr_mask = tr_mask.permute(0, 3, 1, 2).contiguous()\n\n        fy_preds = output_dict[\"fy_preds\"]\n        py_preds = output_dict[\"py_preds\"]\n        inds = output_dict[\"inds\"]\n\n        train_mask = input_dict['train_mask']\n        tr_mask = input_dict['tr_mask'] > 
0\n        distance_field = input_dict['distance_field']\n        direction_field = input_dict['direction_field']\n        weight_matrix = input_dict['weight_matrix']\n        gt_tags = input_dict['gt_points']\n\n        # pixel class loss\n        cls_loss = self.cls_ohem(fy_preds[:, 0, :, :], tr_mask.float(), train_mask.bool())\n\n        # distance field loss\n        dis_loss = self.MSE_loss(fy_preds[:, 1, :, :], distance_field)\n        dis_loss = torch.mul(dis_loss, train_mask.float())\n        dis_loss = self.single_image_loss(dis_loss, distance_field)\n\n        # direction field loss\n        norm_loss, angle_loss = self.loss_calc_flux(fy_preds[:, 2:4, :, :],\n                                                    direction_field, weight_matrix, tr_mask, train_mask)\n\n        # boundary point loss\n        point_loss = self.PolyMatchingLoss(py_preds, gt_tags[inds])\n\n        if eps is None:\n            loss_b = 0.05*point_loss\n            loss = cls_loss + 3.0*dis_loss + norm_loss + angle_loss + loss_b\n        else:\n            loss_b = 0.1*(torch.sigmoid(torch.tensor((eps - cfg.max_epoch)/cfg.max_epoch))) * point_loss\n            loss = cls_loss + 3.0*dis_loss + norm_loss + angle_loss + loss_b\n\n        loss_dict = {\n            'total_loss': loss,\n            'cls_loss': cls_loss,\n            'distance loss': 3.0*dis_loss,\n            'dir_loss': norm_loss + angle_loss,\n            'point_loss': loss_b,\n            'norm_loss': norm_loss,\n            'angle_loss': angle_loss,\n\n        }\n\n        return loss_dict\n\n"
  },
  {
    "path": "network/textnet.py",
    "content": "# -*- coding: utf-8 -*-\n# @Time    : 10/1/21\n# @Author  : GXYM\nimport torch\nimport torch.nn as nn\nfrom network.layers.model_block import FPN\nfrom cfglib.config import config as cfg\nimport numpy as np\nfrom network.layers.CircConv import DeepSnake\nfrom network.layers.GCN import GCN\nfrom network.layers.RNN import RNN\nfrom network.layers.Adaptive_Deformation import AdaptiveDeformation\n# from network.layers.Transformer_old import Transformer_old\nfrom network.layers.Transformer import Transformer\nimport cv2\nfrom util.misc import get_sample_point, fill_hole\nfrom network.layers.gcn_utils import get_node_feature, \\\n    get_adj_mat, get_adj_ind, coord_embedding, normalize_adj\nimport torch.nn.functional as F\nimport time\n\n\nclass Evolution(nn.Module):\n    def __init__(self, node_num, adj_num, is_training=True, device=None, model=\"snake\"):\n        super(Evolution, self).__init__()\n        self.node_num = node_num\n        self.adj_num = adj_num\n        self.device = device\n        self.is_training = is_training\n        self.clip_dis = 16\n\n        self.iter = 3\n        if model == \"gcn\":\n            self.adj = get_adj_mat(self.adj_num, self.node_num)\n            self.adj = normalize_adj(self.adj, type=\"DAD\").float().to(self.device)\n            for i in range(self.iter):\n                evolve_gcn = GCN(36, 128)\n                self.__setattr__('evolve_gcn' + str(i), evolve_gcn)\n        elif model == \"rnn\":\n            self.adj = None\n            for i in range(self.iter):\n                evolve_gcn = RNN(36, 128)\n                self.__setattr__('evolve_gcn' + str(i), evolve_gcn)\n        elif model == \"AD\":\n            self.adj = get_adj_mat(self.adj_num, self.node_num)\n            self.adj = normalize_adj(self.adj, type=\"DAD\").float().to(self.device)\n            for i in range(self.iter):\n                evolve_gcn = AdaptiveDeformation(36, 128)\n                self.__setattr__('evolve_gcn' + str(i), 
evolve_gcn)\n        # elif model == \"BT_old\":\n        #     self.adj = None\n        #     for i in range(self.iter):\n        #         evolve_gcn = Transformer_old(36, 512, num_heads=8,\n        #                                  dim_feedforward=2048, drop_rate=0.0, if_resi=True, block_nums=4)\n        #         self.__setattr__('evolve_gcn' + str(i), evolve_gcn)\n        elif model == \"BT\":\n            self.adj = None\n            for i in range(self.iter):\n                evolve_gcn = Transformer(36, 128, num_heads=8,\n                                         dim_feedforward=1024, drop_rate=0.0, if_resi=True, block_nums=3)\n                self.__setattr__('evolve_gcn' + str(i), evolve_gcn)\n        else:\n            self.adj = get_adj_ind(self.adj_num, self.node_num, self.device)\n            for i in range(self.iter):\n                evolve_gcn = DeepSnake(state_dim=128, feature_dim=36, conv_type='dgrid')\n                self.__setattr__('evolve_gcn' + str(i), evolve_gcn)\n\n        for m in self.modules():\n            if isinstance(m, nn.Conv1d) or isinstance(m, nn.Conv2d):\n                m.weight.data.normal_(0.0, 0.02)\n                # nn.init.kaiming_normal_(m.weight, mode='fan_in')\n                if m.bias is not None:\n                    nn.init.constant_(m.bias, 0)\n\n    @staticmethod\n    def get_boundary_proposal(input=None, seg_preds=None, switch=\"gt\"):\n\n        if switch == \"gt\":\n            inds = torch.where(input['ignore_tags'] > 0)\n            # if len(inds[0]) > 320:\n            #    inds = (inds[0][:320], inds[1][:320])\n            init_polys = input['proposal_points'][inds]\n        else:\n            tr_masks = input['tr_mask'].cpu().numpy()\n            tcl_masks = seg_preds[:, 0, :, :].detach().cpu().numpy() > cfg.threshold\n            inds = []\n            init_polys = []\n            for bid, tcl_mask in enumerate(tcl_masks):\n                ret, labels = 
cv2.connectedComponents(tcl_mask.astype(np.uint8), connectivity=8)\n                for idx in range(1, ret):\n                    text_mask = labels == idx\n                    ist_id = int(np.sum(text_mask*tr_masks[bid])/np.sum(text_mask))-1\n                    inds.append([bid, ist_id])\n                    poly = get_sample_point(text_mask, cfg.num_points, cfg.approx_factor)\n                    init_polys.append(poly)\n            inds = torch.from_numpy(np.array(inds)).permute(1, 0).to(input[\"img\"].device)\n            init_polys = torch.from_numpy(np.array(init_polys)).to(input[\"img\"].device)\n\n        return init_polys, inds, None\n\n    def get_boundary_proposal_eval(self, input=None, seg_preds=None):\n\n        # if cfg.scale > 1:\n        #     seg_preds = F.interpolate(seg_preds, scale_factor=cfg.scale, mode='bilinear')\n        cls_preds = seg_preds[:, 0, :, :].detach().cpu().numpy()\n        dis_preds = seg_preds[:, 1, :, ].detach().cpu().numpy()\n\n        inds = []\n        init_polys = []\n        confidences = []\n        for bid, dis_pred in enumerate(dis_preds):\n            # # dis_mask = (dis_pred / np.max(dis_pred)) > cfg.dis_threshold\n            dis_mask = dis_pred > cfg.dis_threshold\n            # dis_mask = fill_hole(dis_mask)\n            ret, labels = cv2.connectedComponents(dis_mask.astype(np.uint8), connectivity=8, ltype=cv2.CV_16U)\n            for idx in range(1, ret):\n                text_mask = labels == idx\n                confidence = round(cls_preds[bid][text_mask].mean(), 3)\n                # 50 for MLT2017 and ArT (or DCN is used in backone); else is all 150;\n                # just can set to 50, which has little effect on the performance\n                if np.sum(text_mask) < 50/(cfg.scale*cfg.scale) or confidence < cfg.cls_threshold:\n                    continue\n                confidences.append(confidence)\n                inds.append([bid, 0])\n                \n                poly = 
get_sample_point(text_mask, cfg.num_points,\n                                        cfg.approx_factor, scales=np.array([cfg.scale, cfg.scale]))\n                init_polys.append(poly)\n\n        if len(inds) > 0:\n            inds = torch.from_numpy(np.array(inds)).permute(1, 0).to(input[\"img\"].device, non_blocking=True)\n            init_polys = torch.from_numpy(np.array(init_polys)).to(input[\"img\"].device, non_blocking=True).float()\n        else:\n            init_polys = torch.from_numpy(np.array(init_polys)).to(input[\"img\"].device, non_blocking=True).float()\n            inds = torch.from_numpy(np.array(inds)).to(input[\"img\"].device, non_blocking=True)\n\n        return init_polys, inds, confidences\n\n    def evolve_poly(self, snake, cnn_feature, i_it_poly, ind):\n        if len(i_it_poly) == 0:\n            return torch.zeros_like(i_it_poly)\n        h, w = cnn_feature.size(2)*cfg.scale, cnn_feature.size(3)*cfg.scale\n        node_feats = get_node_feature(cnn_feature, i_it_poly, ind, h, w)\n        i_poly = i_it_poly + torch.clamp(snake(node_feats, self.adj).permute(0, 2, 1), -self.clip_dis, self.clip_dis)\n        if self.is_training:\n            i_poly = torch.clamp(i_poly, 0, w-1)\n        else:\n            i_poly[:, :, 0] = torch.clamp(i_poly[:, :, 0], 0, w - 1)\n            i_poly[:, :, 1] = torch.clamp(i_poly[:, :, 1], 0, h - 1)\n        return i_poly\n\n    def forward(self, embed_feature, input=None, seg_preds=None, switch=\"gt\"):\n        if self.is_training:\n            init_polys, inds, confidences = self.get_boundary_proposal(input=input, seg_preds=seg_preds, switch=switch)\n            # TODO sample fix number\n        else:\n            init_polys, inds, confidences = self.get_boundary_proposal_eval(input=input, seg_preds=seg_preds)\n            if init_polys.shape[0] == 0:\n                return [init_polys for i in range(self.iter+1)], inds, confidences\n\n        py_preds = [init_polys, ]\n        for i in range(self.iter):\n  
          evolve_gcn = self.__getattr__('evolve_gcn' + str(i))\n            init_polys = self.evolve_poly(evolve_gcn, embed_feature, init_polys, inds[0])\n            py_preds.append(init_polys)\n\n        return py_preds, inds, confidences\n\n\nclass TextNet(nn.Module):\n\n    def __init__(self, backbone='vgg', is_training=True):\n        super().__init__()\n        self.is_training = is_training\n        self.backbone_name = backbone\n        self.fpn = FPN(self.backbone_name, is_training=(not cfg.resume and is_training))\n\n        self.seg_head = nn.Sequential(\n            nn.Conv2d(32, 16, kernel_size=3, padding=2, dilation=2),\n            nn.PReLU(),\n            nn.Conv2d(16, 16, kernel_size=3, padding=4, dilation=4),\n            nn.PReLU(),\n            nn.Conv2d(16, 4, kernel_size=1, stride=1, padding=0),\n        )\n        self.BPN = Evolution(cfg.num_points, adj_num=4,\n                             is_training=is_training, device=cfg.device, model=\"BT\")\n\n    def load_model(self, model_path):\n        print('Loading from {}'.format(model_path))\n        state_dict = torch.load(model_path, map_location=torch.device(cfg.device))\n        checkpoint = state_dict.get('model', state_dict)\n        missing_keys, unexpected_keys = self.load_state_dict(checkpoint, strict=False)\n        if missing_keys:\n            print('[WARN] Missing keys when loading {}: {}'.format(model_path, missing_keys))\n        if unexpected_keys:\n            print('[WARN] Unexpected keys skipped from {}: {}'.format(model_path, unexpected_keys))\n\n    def forward(self, input_dict, test_speed=False):\n        output = {}\n        b, c, h, w = input_dict[\"img\"].shape\n        if self.is_training or cfg.exp_name in ['ArT', 'MLT2017', \"MLT2019\"] or test_speed:\n            image = input_dict[\"img\"]\n        else:\n            image = torch.zeros((b, c, cfg.test_size[1], cfg.test_size[1]), dtype=torch.float32).to(cfg.device)\n            image[:, :, :h, :w] = 
input_dict[\"img\"][:, :, :, :]\n\n        up1, _, _, _, _ = self.fpn(image)\n        up1 = up1[:, :, :h // cfg.scale, :w // cfg.scale]\n\n        preds = self.seg_head(up1)\n        fy_preds = torch.cat([torch.sigmoid(preds[:, 0:2, :, :]), preds[:, 2:4, :, :]], dim=1)\n        cnn_feats = torch.cat([up1, fy_preds], dim=1)\n\n        py_preds, inds, confidences = self.BPN(cnn_feats, input=input_dict, seg_preds=fy_preds, switch=\"gt\")\n        \n        output[\"fy_preds\"] = fy_preds\n        output[\"py_preds\"] = py_preds\n        output[\"inds\"] = inds\n        output[\"confidences\"] = confidences\n\n        return output\n"
  },
  {
    "path": "output/Analysis/ctw1500_eval.txt",
    "content": "640_1024_100 0.35/0.825 ALL :: AP=0.6632 - Precision=0.8515 - Recall=0.7722 - Fscore=0.8099\n640_1024_105 0.35/0.825 ALL :: AP=0.6544 - Precision=0.8395 - Recall=0.7774 - Fscore=0.8072\n640_1024_110 0.35/0.825 ALL :: AP=0.6883 - Precision=0.8492 - Recall=0.8018 - Fscore=0.8248\n640_1024_115 0.35/0.825 ALL :: AP=0.6657 - Precision=0.8621 - Recall=0.7663 - Fscore=0.8114\n640_1024_120 0.35/0.825 ALL :: AP=0.6580 - Precision=0.8451 - Recall=0.7699 - Fscore=0.8057\n640_1024_125 0.35/0.825 ALL :: AP=0.6577 - Precision=0.8537 - Recall=0.7683 - Fscore=0.8087\n640_1024_130 0.35/0.825 ALL :: AP=0.6809 - Precision=0.8626 - Recall=0.7839 - Fscore=0.8214\n640_1024_135 0.35/0.825 ALL :: AP=0.6669 - Precision=0.8518 - Recall=0.7754 - Fscore=0.8118\n640_1024_140 0.35/0.825 ALL :: AP=0.6896 - Precision=0.8660 - Recall=0.7920 - Fscore=0.8274\n640_1024_145 0.35/0.825 ALL :: AP=0.6684 - Precision=0.8572 - Recall=0.7751 - Fscore=0.8141\n640_1024_150 0.35/0.825 ALL :: AP=0.6791 - Precision=0.8726 - Recall=0.7705 - Fscore=0.8184\n640_1024_155 0.35/0.825 ALL :: AP=0.6903 - Precision=0.8451 - Recall=0.8093 - Fscore=0.8268\n640_1024_160 0.35/0.825 ALL :: AP=0.6880 - Precision=0.8580 - Recall=0.7937 - Fscore=0.8246\n640_1024_165 0.35/0.825 ALL :: AP=0.6772 - Precision=0.8463 - Recall=0.7953 - Fscore=0.8200\n640_1024_170 0.35/0.825 ALL :: AP=0.6857 - Precision=0.8497 - Recall=0.8018 - Fscore=0.8251\n640_1024_175 0.35/0.825 ALL :: AP=0.6968 - Precision=0.8787 - Recall=0.7885 - Fscore=0.8311\n640_1024_180 0.35/0.825 ALL :: AP=0.6832 - Precision=0.8473 - Recall=0.7995 - Fscore=0.8227\n640_1024_185 0.35/0.825 ALL :: AP=0.6979 - Precision=0.8508 - Recall=0.8123 - Fscore=0.8311\n640_1024_190 0.35/0.825 ALL :: AP=0.6934 - Precision=0.8686 - Recall=0.7927 - Fscore=0.8289\n640_1024_195 0.35/0.825 ALL :: AP=0.6944 - Precision=0.8657 - Recall=0.7966 - Fscore=0.8297\n640_1024_200 0.35/0.825 ALL :: AP=0.6974 - Precision=0.8494 - Recall=0.8165 - Fscore=0.8326\n640_1024_205 0.35/0.825 ALL :: 
AP=0.6833 - Precision=0.8523 - Recall=0.7956 - Fscore=0.8230\n640_1024_210 0.35/0.825 ALL :: AP=0.6821 - Precision=0.8675 - Recall=0.7790 - Fscore=0.8209\n640_1024_215 0.35/0.825 ALL :: AP=0.7005 - Precision=0.8669 - Recall=0.7986 - Fscore=0.8314\n640_1024_220 0.35/0.825 ALL :: AP=0.6879 - Precision=0.8670 - Recall=0.7859 - Fscore=0.8244\n640_1024_225 0.35/0.825 ALL :: AP=0.6713 - Precision=0.8678 - Recall=0.7663 - Fscore=0.8139\n640_1024_230 0.35/0.825 ALL :: AP=0.6994 - Precision=0.8444 - Recall=0.8175 - Fscore=0.8307\n640_1024_235 0.35/0.825 ALL :: AP=0.7070 - Precision=0.8759 - Recall=0.8008 - Fscore=0.8367\n640_1024_240 0.35/0.825 ALL :: AP=0.6847 - Precision=0.8722 - Recall=0.7787 - Fscore=0.8228\n640_1024_245 0.35/0.825 ALL :: AP=0.6970 - Precision=0.8541 - Recall=0.8093 - Fscore=0.8311\n640_1024_250 0.35/0.825 ALL :: AP=0.7015 - Precision=0.8534 - Recall=0.8119 - Fscore=0.8321\n640_1024_255 0.35/0.825 ALL :: AP=0.6975 - Precision=0.8485 - Recall=0.8106 - Fscore=0.8291\n640_1024_260 0.35/0.825 ALL :: AP=0.6754 - Precision=0.8572 - Recall=0.7787 - Fscore=0.8161\n640_1024_265 0.35/0.825 ALL :: AP=0.6902 - Precision=0.8610 - Recall=0.7953 - Fscore=0.8268\n640_1024_270 0.35/0.825 ALL :: AP=0.6844 - Precision=0.8678 - Recall=0.7829 - Fscore=0.8232\n640_1024_275 0.35/0.825 ALL :: AP=0.7021 - Precision=0.8690 - Recall=0.8018 - Fscore=0.8340\n640_1024_280 0.35/0.825 ALL :: AP=0.6991 - Precision=0.8442 - Recall=0.8198 - Fscore=0.8318\n640_1024_285 0.35/0.825 ALL :: AP=0.7004 - Precision=0.8603 - Recall=0.8067 - Fscore=0.8326\n640_1024_290 0.35/0.825 ALL :: AP=0.6953 - Precision=0.8449 - Recall=0.8129 - Fscore=0.8286\n640_1024_295 0.35/0.825 ALL :: AP=0.7120 - Precision=0.8629 - Recall=0.8207 - Fscore=0.8413\n640_1024_300 0.35/0.825 ALL :: AP=0.7091 - Precision=0.8422 - Recall=0.8315 - Fscore=0.8368\n640_1024_305 0.35/0.825 ALL :: AP=0.7105 - Precision=0.8420 - Recall=0.8357 - Fscore=0.8389\n640_1024_310 0.35/0.825 ALL :: AP=0.7002 - Precision=0.8469 - Recall=0.8220 - 
Fscore=0.8343\n640_1024_315 0.35/0.825 ALL :: AP=0.6916 - Precision=0.8551 - Recall=0.8018 - Fscore=0.8276\n640_1024_320 0.35/0.825 ALL :: AP=0.7021 - Precision=0.8529 - Recall=0.8142 - Fscore=0.8331\n640_1024_325 0.35/0.825 ALL :: AP=0.7042 - Precision=0.8406 - Recall=0.8269 - Fscore=0.8337\n640_1024_330 0.35/0.825 ALL :: AP=0.7052 - Precision=0.8578 - Recall=0.8139 - Fscore=0.8353\n640_1024_335 0.35/0.825 ALL :: AP=0.7005 - Precision=0.8608 - Recall=0.8083 - Fscore=0.8338\n640_1024_340 0.35/0.825 ALL :: AP=0.6979 - Precision=0.8683 - Recall=0.7976 - Fscore=0.8315\n640_1024_345 0.35/0.825 ALL :: AP=0.6818 - Precision=0.8432 - Recall=0.7995 - Fscore=0.8208\n640_1024_350 0.35/0.825 ALL :: AP=0.6933 - Precision=0.8465 - Recall=0.8090 - Fscore=0.8273\n640_1024_355 0.35/0.825 ALL :: AP=0.6955 - Precision=0.8534 - Recall=0.8067 - Fscore=0.8294\n640_1024_360 0.35/0.825 ALL :: AP=0.6930 - Precision=0.8762 - Recall=0.7819 - Fscore=0.8264\n640_1024_365 0.35/0.825 ALL :: AP=0.6974 - Precision=0.8499 - Recall=0.8123 - Fscore=0.8307\n640_1024_370 0.35/0.825 ALL :: AP=0.6947 - Precision=0.8456 - Recall=0.8139 - Fscore=0.8294\n640_1024_375 0.35/0.825 ALL :: AP=0.6806 - Precision=0.8414 - Recall=0.7989 - Fscore=0.8196\n640_1024_380 0.35/0.825 ALL :: AP=0.6957 - Precision=0.8538 - Recall=0.8070 - Fscore=0.8298\n640_1024_385 0.35/0.825 ALL :: AP=0.6993 - Precision=0.8491 - Recall=0.8158 - Fscore=0.8321\n640_1024_390 0.35/0.825 ALL :: AP=0.6997 - Precision=0.8530 - Recall=0.8113 - Fscore=0.8316\n640_1024_395 0.35/0.825 ALL :: AP=0.6755 - Precision=0.8327 - Recall=0.8015 - Fscore=0.8168\n640_1024_400 0.35/0.825 ALL :: AP=0.7079 - Precision=0.8689 - Recall=0.8057 - Fscore=0.8361\n640_1024_405 0.35/0.825 ALL :: AP=0.6922 - Precision=0.8531 - Recall=0.8048 - Fscore=0.8282\n640_1024_410 0.35/0.825 ALL :: AP=0.7096 - Precision=0.8579 - Recall=0.8204 - Fscore=0.8387\n640_1024_415 0.35/0.825 ALL :: AP=0.7033 - Precision=0.8450 - Recall=0.8230 - Fscore=0.8339\n640_1024_420 0.35/0.825 ALL :: 
AP=0.6921 - Precision=0.8343 - Recall=0.8220 - Fscore=0.8281\n640_1024_425 0.35/0.825 ALL :: AP=0.6949 - Precision=0.8466 - Recall=0.8129 - Fscore=0.8294\n640_1024_430 0.35/0.825 ALL :: AP=0.6924 - Precision=0.8455 - Recall=0.8100 - Fscore=0.8274\n640_1024_435 0.35/0.825 ALL :: AP=0.6899 - Precision=0.8387 - Recall=0.8119 - Fscore=0.8251\n640_1024_440 0.35/0.825 ALL :: AP=0.7132 - Precision=0.8779 - Recall=0.8061 - Fscore=0.8404\n640_1024_445 0.35/0.825 ALL :: AP=0.6952 - Precision=0.8301 - Recall=0.8331 - Fscore=0.8316\n640_1024_450 0.35/0.825 ALL :: AP=0.7022 - Precision=0.8593 - Recall=0.8083 - Fscore=0.8331\n640_1024_455 0.35/0.825 ALL :: AP=0.6989 - Precision=0.8478 - Recall=0.8155 - Fscore=0.8314\n640_1024_460 0.35/0.825 ALL :: AP=0.7011 - Precision=0.8488 - Recall=0.8181 - Fscore=0.8332\n640_1024_465 0.35/0.825 ALL :: AP=0.7035 - Precision=0.8504 - Recall=0.8191 - Fscore=0.8345\n640_1024_470 0.35/0.825 ALL :: AP=0.6974 - Precision=0.8583 - Recall=0.8057 - Fscore=0.8312\n640_1024_475 0.35/0.825 ALL :: AP=0.7064 - Precision=0.8565 - Recall=0.8171 - Fscore=0.8364\n640_1024_480 0.35/0.825 ALL :: AP=0.6938 - Precision=0.8571 - Recall=0.8015 - Fscore=0.8284\n640_1024_485 0.35/0.825 ALL :: AP=0.6903 - Precision=0.8358 - Recall=0.8181 - Fscore=0.8269\n640_1024_490 0.35/0.825 ALL :: AP=0.7216 - Precision=0.8653 - Recall=0.8269 - Fscore=0.8457\n640_1024_495 0.35/0.825 ALL :: AP=0.6949 - Precision=0.8535 - Recall=0.8054 - Fscore=0.8288\n640_1024_500 0.35/0.825 ALL :: AP=0.7022 - Precision=0.8380 - Recall=0.8295 - Fscore=0.8337\n640_1024_505 0.35/0.825 ALL :: AP=0.7058 - Precision=0.8461 - Recall=0.8259 - Fscore=0.8359\n640_1024_510 0.35/0.825 ALL :: AP=0.7020 - Precision=0.8359 - Recall=0.8302 - Fscore=0.8330\n640_1024_515 0.35/0.825 ALL :: AP=0.7108 - Precision=0.8612 - Recall=0.8188 - Fscore=0.8394\n640_1024_520 0.35/0.825 ALL :: AP=0.6952 - Precision=0.8337 - Recall=0.8250 - Fscore=0.8293\n640_1024_525 0.35/0.825 ALL :: AP=0.6513 - Precision=0.7806 - Recall=0.8256 - 
Fscore=0.8025\n640_1024_530 0.35/0.825 ALL :: AP=0.7150 - Precision=0.8620 - Recall=0.8227 - Fscore=0.8419\n640_1024_535 0.35/0.825 ALL :: AP=0.7025 - Precision=0.8563 - Recall=0.8136 - Fscore=0.8344\n640_1024_540 0.35/0.825 ALL :: AP=0.6968 - Precision=0.8436 - Recall=0.8207 - Fscore=0.8320\n640_1024_545 0.35/0.825 ALL :: AP=0.7020 - Precision=0.8393 - Recall=0.8289 - Fscore=0.8340\n640_1024_550 0.35/0.825 ALL :: AP=0.7073 - Precision=0.8600 - Recall=0.8152 - Fscore=0.8370\n640_1024_555 0.35/0.825 ALL :: AP=0.7098 - Precision=0.8350 - Recall=0.8413 - Fscore=0.8381\n640_1024_560 0.35/0.825 ALL :: AP=0.6873 - Precision=0.8209 - Recall=0.8279 - Fscore=0.8244\n640_1024_565 0.35/0.825 ALL :: AP=0.7031 - Precision=0.8477 - Recall=0.8237 - Fscore=0.8355\n640_1024_570 0.35/0.825 ALL :: AP=0.7034 - Precision=0.8444 - Recall=0.8240 - Fscore=0.8340\n640_1024_575 0.35/0.825 ALL :: AP=0.6792 - Precision=0.8153 - Recall=0.8217 - Fscore=0.8185\n640_1024_580 0.35/0.825 ALL :: AP=0.6882 - Precision=0.8304 - Recall=0.8217 - Fscore=0.8260\n640_1024_585 0.35/0.825 ALL :: AP=0.6805 - Precision=0.8316 - Recall=0.8110 - Fscore=0.8211\n640_1024_590 0.35/0.825 ALL :: AP=0.7039 - Precision=0.8354 - Recall=0.8321 - Fscore=0.8338\n640_1024_595 0.35/0.825 ALL :: AP=0.7045 - Precision=0.8492 - Recall=0.8204 - Fscore=0.8345\n640_1024_600 0.35/0.825 ALL :: AP=0.6956 - Precision=0.8333 - Recall=0.8259 - Fscore=0.8296\n640_1024_605 0.35/0.825 ALL :: AP=0.7063 - Precision=0.8498 - Recall=0.8227 - Fscore=0.8360\n640_1024_610 0.35/0.825 ALL :: AP=0.6922 - Precision=0.8296 - Recall=0.8250 - Fscore=0.8273\n640_1024_615 0.35/0.825 ALL :: AP=0.6970 - Precision=0.8459 - Recall=0.8175 - Fscore=0.8314\n640_1024_620 0.35/0.825 ALL :: AP=0.6971 - Precision=0.8452 - Recall=0.8152 - Fscore=0.8299\n640_1024_625 0.35/0.825 ALL :: AP=0.7046 - Precision=0.8491 - Recall=0.8214 - Fscore=0.8350\n640_1024_630 0.35/0.825 ALL :: AP=0.6983 - Precision=0.8487 - Recall=0.8139 - Fscore=0.8309\n640_1024_635 0.35/0.825 ALL :: 
AP=0.6959 - Precision=0.8364 - Recall=0.8233 - Fscore=0.8298\n640_1024_640 0.35/0.825 ALL :: AP=0.6992 - Precision=0.8524 - Recall=0.8129 - Fscore=0.8322\n640_1024_645 0.35/0.825 ALL :: AP=0.6922 - Precision=0.8312 - Recall=0.8220 - Fscore=0.8266\n640_1024_650 0.35/0.825 ALL :: AP=0.6848 - Precision=0.8387 - Recall=0.8087 - Fscore=0.8234\n640_1024_655 0.35/0.825 ALL :: AP=0.6947 - Precision=0.8377 - Recall=0.8211 - Fscore=0.8293\n640_1024_660 0.35/0.825 ALL :: AP=0.6893 - Precision=0.8338 - Recall=0.8175 - Fscore=0.8255\n"
  },
  {
    "path": "output/Analysis/totalText_eval.txt",
    "content": "640_1024_55_0.7_0.6 0.3/0.85 ALL :: Precision=0.8919 - Recall=0.5764 - Fscore=0.7003\n640_1024_55_0.8_0.4 0.3/0.85 ALL :: Precision=0.7440 - Recall=0.4755 - Fscore=0.5802\n640_1024_60_0.7_0.6 0.3/0.85 ALL :: Precision=0.8882 - Recall=0.7482 - Fscore=0.8122\n640_1024_60_0.8_0.4 0.3/0.85 ALL :: Precision=0.8648 - Recall=0.6999 - Fscore=0.7737\n640_1024_65_0.7_0.6 0.3/0.85 ALL :: Precision=0.8836 - Recall=0.7422 - Fscore=0.8067\n640_1024_65_0.8_0.4 0.3/0.85 ALL :: Precision=0.8580 - Recall=0.6985 - Fscore=0.7701\n640_1024_70_0.7_0.6 0.3/0.85 ALL :: Precision=0.8578 - Recall=0.7809 - Fscore=0.8175\n640_1024_70_0.8_0.4 0.3/0.85 ALL :: Precision=0.8510 - Recall=0.7440 - Fscore=0.7939\n640_1024_75_0.7_0.6 0.3/0.85 ALL :: Precision=0.9005 - Recall=0.7093 - Fscore=0.7935\n640_1024_75_0.8_0.4 0.3/0.85 ALL :: Precision=0.7720 - Recall=0.5996 - Fscore=0.6750\n640_1024_80_0.7_0.6 0.3/0.85 ALL :: Precision=0.8779 - Recall=0.7393 - Fscore=0.8026\n640_1024_80_0.8_0.4 0.3/0.85 ALL :: Precision=0.9035 - Recall=0.7136 - Fscore=0.7974\n640_1024_85_0.7_0.6 0.3/0.85 ALL :: Precision=0.9066 - Recall=0.7356 - Fscore=0.8122\n640_1024_85_0.8_0.4 0.3/0.85 ALL :: Precision=0.9056 - Recall=0.6985 - Fscore=0.7887\n640_1024_90_0.7_0.6 0.3/0.85 ALL :: Precision=0.9147 - Recall=0.7342 - Fscore=0.8146\n640_1024_90_0.8_0.4 0.3/0.85 ALL :: Precision=0.8714 - Recall=0.6778 - Fscore=0.7625\n640_1024_95_0.7_0.6 0.3/0.85 ALL :: Precision=0.9105 - Recall=0.7541 - Fscore=0.8249\n640_1024_95_0.8_0.4 0.3/0.85 ALL :: Precision=0.8974 - Recall=0.7189 - Fscore=0.7983\n640_1024_100_0.7_0.6 0.3/0.85 ALL :: Precision=0.8707 - Recall=0.7873 - Fscore=0.8269\n640_1024_100_0.8_0.4 0.3/0.85 ALL :: Precision=0.8760 - Recall=0.7581 - Fscore=0.8128\n640_1024_105_0.7_0.6 0.3/0.85 ALL :: Precision=0.8976 - Recall=0.7715 - Fscore=0.8298\n640_1024_105_0.8_0.4 0.3/0.85 ALL :: Precision=0.8745 - Recall=0.7299 - Fscore=0.7956\n640_1024_110_0.7_0.6 0.3/0.85 ALL :: Precision=0.8657 - Recall=0.7544 - 
Fscore=0.8062\n640_1024_110_0.8_0.4 0.3/0.85 ALL :: Precision=0.8370 - Recall=0.7028 - Fscore=0.7641\n640_1024_115_0.7_0.6 0.3/0.85 ALL :: Precision=0.8907 - Recall=0.7803 - Fscore=0.8318\n640_1024_115_0.8_0.4 0.3/0.85 ALL :: Precision=0.8775 - Recall=0.7494 - Fscore=0.8084\n640_1024_120_0.7_0.6 0.3/0.85 ALL :: Precision=0.9044 - Recall=0.7874 - Fscore=0.8419\n640_1024_120_0.8_0.4 0.3/0.85 ALL :: Precision=0.8854 - Recall=0.7429 - Fscore=0.8079\n640_1024_125_0.7_0.6 0.3/0.85 ALL :: Precision=0.8920 - Recall=0.7837 - Fscore=0.8343\n640_1024_125_0.8_0.4 0.3/0.85 ALL :: Precision=0.8616 - Recall=0.7341 - Fscore=0.7927\n640_1024_130_0.7_0.6 0.3/0.85 ALL :: Precision=0.9021 - Recall=0.7599 - Fscore=0.8249\n640_1024_130_0.8_0.4 0.3/0.85 ALL :: Precision=0.8815 - Recall=0.7221 - Fscore=0.7939\n640_1024_135_0.7_0.6 0.3/0.85 ALL :: Precision=0.9095 - Recall=0.7417 - Fscore=0.8170\n640_1024_135_0.8_0.4 0.3/0.85 ALL :: Precision=0.8822 - Recall=0.6976 - Fscore=0.7792\n640_1024_140_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.7681 - Fscore=0.8310\n640_1024_140_0.8_0.4 0.3/0.85 ALL :: Precision=0.8372 - Recall=0.7023 - Fscore=0.7638\n640_1024_145_0.7_0.6 0.3/0.85 ALL :: Precision=0.9037 - Recall=0.7492 - Fscore=0.8192\n640_1024_145_0.8_0.4 0.3/0.85 ALL :: Precision=0.8498 - Recall=0.6900 - Fscore=0.7616\n640_1024_150_0.7_0.6 0.3/0.85 ALL :: Precision=0.8820 - Recall=0.7822 - Fscore=0.8291\n640_1024_150_0.8_0.4 0.3/0.85 ALL :: Precision=0.7898 - Recall=0.6825 - Fscore=0.7322\n640_1024_155_0.7_0.6 0.3/0.85 ALL :: Precision=0.9289 - Recall=0.7593 - Fscore=0.8356\n640_1024_155_0.8_0.4 0.3/0.85 ALL :: Precision=0.8968 - Recall=0.7114 - Fscore=0.7934\n640_1024_160_0.7_0.6 0.3/0.85 ALL :: Precision=0.8956 - Recall=0.7883 - Fscore=0.8385\n640_1024_160_0.8_0.4 0.3/0.85 ALL :: Precision=0.8661 - Recall=0.7422 - Fscore=0.7994\n640_1024_165_0.7_0.6 0.3/0.85 ALL :: Precision=0.8919 - Recall=0.7734 - Fscore=0.8284\n640_1024_165_0.8_0.4 0.3/0.85 ALL :: Precision=0.9038 - Recall=0.7515 
- Fscore=0.8206\n640_1024_170_0.7_0.6 0.3/0.85 ALL :: Precision=0.9014 - Recall=0.7495 - Fscore=0.8184\n640_1024_170_0.8_0.4 0.3/0.85 ALL :: Precision=0.8178 - Recall=0.6698 - Fscore=0.7364\n640_1024_175_0.7_0.6 0.3/0.85 ALL :: Precision=0.8883 - Recall=0.7802 - Fscore=0.8307\n640_1024_175_0.8_0.4 0.3/0.85 ALL :: Precision=0.8175 - Recall=0.7062 - Fscore=0.7578\n640_1024_180_0.7_0.6 0.3/0.85 ALL :: Precision=0.8944 - Recall=0.8206 - Fscore=0.8559\n640_1024_180_0.8_0.4 0.3/0.85 ALL :: Precision=0.8799 - Recall=0.7856 - Fscore=0.8300\n640_1024_185_0.7_0.6 0.3/0.85 ALL :: Precision=0.9049 - Recall=0.8038 - Fscore=0.8514\n640_1024_185_0.8_0.4 0.3/0.85 ALL :: Precision=0.8634 - Recall=0.7460 - Fscore=0.8004\n640_1024_190_0.7_0.6 0.3/0.85 ALL :: Precision=0.9025 - Recall=0.7893 - Fscore=0.8421\n640_1024_190_0.8_0.4 0.3/0.85 ALL :: Precision=0.8476 - Recall=0.7310 - Fscore=0.7850\n640_1024_195_0.7_0.6 0.3/0.85 ALL :: Precision=0.8633 - Recall=0.8346 - Fscore=0.8487\n640_1024_195_0.8_0.4 0.3/0.85 ALL :: Precision=0.8211 - Recall=0.7784 - Fscore=0.7992\n640_1024_200_0.7_0.6 0.3/0.85 ALL :: Precision=0.8903 - Recall=0.7979 - Fscore=0.8416\n640_1024_200_0.8_0.4 0.3/0.85 ALL :: Precision=0.8883 - Recall=0.7704 - Fscore=0.8251\n640_1024_205_0.7_0.6 0.3/0.85 ALL :: Precision=0.8790 - Recall=0.8166 - Fscore=0.8467\n640_1024_205_0.8_0.4 0.3/0.85 ALL :: Precision=0.8131 - Recall=0.7435 - Fscore=0.7767\n640_1024_210_0.7_0.6 0.3/0.85 ALL :: Precision=0.8907 - Recall=0.7886 - Fscore=0.8365\n640_1024_210_0.8_0.4 0.3/0.85 ALL :: Precision=0.8386 - Recall=0.7358 - Fscore=0.7839\n640_1024_215_0.7_0.6 0.3/0.85 ALL :: Precision=0.9042 - Recall=0.8074 - Fscore=0.8530\n640_1024_215_0.8_0.4 0.3/0.85 ALL :: Precision=0.8894 - Recall=0.7689 - Fscore=0.8247\n640_1024_220_0.7_0.6 0.3/0.85 ALL :: Precision=0.9024 - Recall=0.8124 - Fscore=0.8551\n640_1024_220_0.8_0.4 0.3/0.85 ALL :: Precision=0.8894 - Recall=0.7789 - Fscore=0.8305\n640_1024_225_0.7_0.6 0.3/0.85 ALL :: Precision=0.9243 - 
Recall=0.7753 - Fscore=0.8433\n640_1024_225_0.8_0.4 0.3/0.85 ALL :: Precision=0.8966 - Recall=0.7331 - Fscore=0.8067\n640_1024_230_0.7_0.6 0.3/0.85 ALL :: Precision=0.9038 - Recall=0.7887 - Fscore=0.8423\n640_1024_230_0.8_0.4 0.3/0.85 ALL :: Precision=0.8479 - Recall=0.7240 - Fscore=0.7811\n640_1024_235_0.7_0.6 0.3/0.85 ALL :: Precision=0.8893 - Recall=0.8155 - Fscore=0.8508\n640_1024_235_0.8_0.4 0.3/0.85 ALL :: Precision=0.8738 - Recall=0.7748 - Fscore=0.8213\n640_1024_240_0.7_0.6 0.3/0.85 ALL :: Precision=0.9134 - Recall=0.7917 - Fscore=0.8482\n640_1024_240_0.8_0.4 0.3/0.85 ALL :: Precision=0.8867 - Recall=0.7480 - Fscore=0.8115\n640_1024_245_0.7_0.6 0.3/0.85 ALL :: Precision=0.8786 - Recall=0.8240 - Fscore=0.8505\n640_1024_245_0.8_0.4 0.3/0.85 ALL :: Precision=0.8876 - Recall=0.8023 - Fscore=0.8428\n640_1024_250_0.7_0.6 0.3/0.85 ALL :: Precision=0.9029 - Recall=0.8116 - Fscore=0.8548\n640_1024_250_0.8_0.4 0.3/0.85 ALL :: Precision=0.8781 - Recall=0.7730 - Fscore=0.8222\n640_1024_255_0.7_0.6 0.3/0.85 ALL :: Precision=0.8993 - Recall=0.8044 - Fscore=0.8492\n640_1024_255_0.8_0.4 0.3/0.85 ALL :: Precision=0.8817 - Recall=0.7730 - Fscore=0.8237\n640_1024_260_0.7_0.6 0.3/0.85 ALL :: Precision=0.9177 - Recall=0.7869 - Fscore=0.8473\n640_1024_260_0.8_0.4 0.3/0.85 ALL :: Precision=0.8965 - Recall=0.7519 - Fscore=0.8179\n640_1024_265_0.7_0.6 0.3/0.85 ALL :: Precision=0.9016 - Recall=0.7657 - Fscore=0.8281\n640_1024_265_0.8_0.4 0.3/0.85 ALL :: Precision=0.8101 - Recall=0.6797 - Fscore=0.7392\n640_1024_270_0.7_0.6 0.3/0.85 ALL :: Precision=0.9182 - Recall=0.7625 - Fscore=0.8332\n640_1024_270_0.8_0.4 0.3/0.85 ALL :: Precision=0.8800 - Recall=0.7181 - Fscore=0.7908\n640_1024_275_0.7_0.6 0.3/0.85 ALL :: Precision=0.8808 - Recall=0.8362 - Fscore=0.8579\n640_1024_275_0.8_0.4 0.3/0.85 ALL :: Precision=0.8450 - Recall=0.7865 - Fscore=0.8147\n640_1024_280_0.7_0.6 0.3/0.85 ALL :: Precision=0.8764 - Recall=0.8334 - Fscore=0.8544\n640_1024_280_0.8_0.4 0.3/0.85 ALL :: Precision=0.8273 
- Recall=0.7779 - Fscore=0.8018\n640_1024_285_0.7_0.6 0.3/0.85 ALL :: Precision=0.9112 - Recall=0.7925 - Fscore=0.8477\n640_1024_285_0.8_0.4 0.3/0.85 ALL :: Precision=0.8581 - Recall=0.7381 - Fscore=0.7936\n640_1024_290_0.7_0.6 0.3/0.85 ALL :: Precision=0.9156 - Recall=0.8065 - Fscore=0.8576\n640_1024_290_0.8_0.4 0.3/0.85 ALL :: Precision=0.8930 - Recall=0.7752 - Fscore=0.8299\n640_1024_295_0.7_0.6 0.3/0.85 ALL :: Precision=0.8806 - Recall=0.8238 - Fscore=0.8512\n640_1024_295_0.8_0.4 0.3/0.85 ALL :: Precision=0.8351 - Recall=0.7685 - Fscore=0.8004\n640_1024_300_0.7_0.6 0.3/0.85 ALL :: Precision=0.9062 - Recall=0.8168 - Fscore=0.8592\n640_1024_300_0.8_0.4 0.3/0.85 ALL :: Precision=0.8715 - Recall=0.7657 - Fscore=0.8152\n640_1024_305_0.7_0.6 0.3/0.85 ALL :: Precision=0.8940 - Recall=0.8250 - Fscore=0.8581\n640_1024_305_0.8_0.4 0.3/0.85 ALL :: Precision=0.8665 - Recall=0.7877 - Fscore=0.8252\n640_1024_310_0.7_0.6 0.3/0.85 ALL :: Precision=0.9030 - Recall=0.8124 - Fscore=0.8553\n640_1024_310_0.8_0.4 0.3/0.85 ALL :: Precision=0.8219 - Recall=0.7302 - Fscore=0.7734\n640_1024_315_0.7_0.6 0.3/0.85 ALL :: Precision=0.8854 - Recall=0.8052 - Fscore=0.8434\n640_1024_315_0.8_0.4 0.3/0.85 ALL :: Precision=0.8243 - Recall=0.7396 - Fscore=0.7797\n640_1024_320_0.7_0.6 0.3/0.85 ALL :: Precision=0.9020 - Recall=0.8080 - Fscore=0.8524\n640_1024_320_0.8_0.4 0.3/0.85 ALL :: Precision=0.8380 - Recall=0.7368 - Fscore=0.7842\n640_1024_325_0.7_0.6 0.3/0.85 ALL :: Precision=0.9207 - Recall=0.7591 - Fscore=0.8321\n640_1024_325_0.8_0.4 0.3/0.85 ALL :: Precision=0.8780 - Recall=0.7137 - Fscore=0.7874\n640_1024_330_0.7_0.6 0.3/0.85 ALL :: Precision=0.9092 - Recall=0.8017 - Fscore=0.8521\n640_1024_330_0.8_0.4 0.3/0.85 ALL :: Precision=0.8613 - Recall=0.7506 - Fscore=0.8022\n640_1024_335_0.7_0.6 0.3/0.85 ALL :: Precision=0.9124 - Recall=0.7884 - Fscore=0.8459\n640_1024_335_0.8_0.4 0.3/0.85 ALL :: Precision=0.8746 - Recall=0.7389 - Fscore=0.8010\n640_1024_340_0.7_0.6 0.3/0.85 ALL :: 
Precision=0.8851 - Recall=0.8261 - Fscore=0.8546\n640_1024_340_0.8_0.4 0.3/0.85 ALL :: Precision=0.8666 - Recall=0.7851 - Fscore=0.8238\n640_1024_345_0.7_0.6 0.3/0.85 ALL :: Precision=0.8868 - Recall=0.8367 - Fscore=0.8610\n640_1024_345_0.8_0.4 0.3/0.85 ALL :: Precision=0.8489 - Recall=0.7883 - Fscore=0.8175\n640_1024_350_0.7_0.6 0.3/0.85 ALL :: Precision=0.8902 - Recall=0.8297 - Fscore=0.8589\n640_1024_350_0.8_0.4 0.3/0.85 ALL :: Precision=0.8778 - Recall=0.8043 - Fscore=0.8394\n640_1024_355_0.7_0.6 0.3/0.85 ALL :: Precision=0.9133 - Recall=0.8091 - Fscore=0.8580\n640_1024_355_0.8_0.4 0.3/0.85 ALL :: Precision=0.8882 - Recall=0.7691 - Fscore=0.8244\n640_1024_360_0.7_0.6 0.3/0.85 ALL :: Precision=0.9170 - Recall=0.8006 - Fscore=0.8549\n640_1024_360_0.8_0.4 0.3/0.85 ALL :: Precision=0.8782 - Recall=0.7532 - Fscore=0.8109\n640_1024_365_0.7_0.6 0.3/0.85 ALL :: Precision=0.9107 - Recall=0.8159 - Fscore=0.8607\n640_1024_365_0.8_0.4 0.3/0.85 ALL :: Precision=0.8958 - Recall=0.7883 - Fscore=0.8386\n640_1024_370_0.7_0.6 0.3/0.85 ALL :: Precision=0.9239 - Recall=0.8065 - Fscore=0.8612\n640_1024_370_0.8_0.4 0.3/0.85 ALL :: Precision=0.8984 - Recall=0.7693 - Fscore=0.8289\n640_1024_375_0.7_0.6 0.3/0.85 ALL :: Precision=0.9133 - Recall=0.8207 - Fscore=0.8645\n640_1024_375_0.8_0.4 0.3/0.85 ALL :: Precision=0.8985 - Recall=0.7885 - Fscore=0.8399\n640_1024_380_0.7_0.6 0.3/0.85 ALL :: Precision=0.9233 - Recall=0.7982 - Fscore=0.8562\n640_1024_380_0.8_0.4 0.3/0.85 ALL :: Precision=0.8988 - Recall=0.7631 - Fscore=0.8254\n640_1024_385_0.7_0.6 0.3/0.85 ALL :: Precision=0.9085 - Recall=0.8270 - Fscore=0.8659\n640_1024_385_0.8_0.4 0.3/0.85 ALL :: Precision=0.8804 - Recall=0.7867 - Fscore=0.8309\n640_1024_390_0.7_0.6 0.3/0.85 ALL :: Precision=0.8920 - Recall=0.8430 - Fscore=0.8668\n640_1024_390_0.8_0.4 0.3/0.85 ALL :: Precision=0.8527 - Recall=0.7916 - Fscore=0.8210\n640_1024_395_0.7_0.6 0.3/0.85 ALL :: Precision=0.9081 - Recall=0.8270 - Fscore=0.8657\n640_1024_395_0.8_0.4 0.3/0.85 ALL 
:: Precision=0.8728 - Recall=0.7782 - Fscore=0.8228\n640_1024_400_0.7_0.6 0.3/0.85 ALL :: Precision=0.9075 - Recall=0.8230 - Fscore=0.8632\n640_1024_400_0.8_0.4 0.3/0.85 ALL :: Precision=0.8488 - Recall=0.7623 - Fscore=0.8032\n640_1024_405_0.7_0.6 0.3/0.85 ALL :: Precision=0.8951 - Recall=0.8366 - Fscore=0.8648\n640_1024_405_0.8_0.4 0.3/0.85 ALL :: Precision=0.8294 - Recall=0.7659 - Fscore=0.7964\n640_1024_410_0.7_0.6 0.3/0.85 ALL :: Precision=0.9087 - Recall=0.8168 - Fscore=0.8603\n640_1024_410_0.8_0.4 0.3/0.85 ALL :: Precision=0.8503 - Recall=0.7520 - Fscore=0.7981\n640_1024_415_0.7_0.6 0.3/0.85 ALL :: Precision=0.8821 - Recall=0.8457 - Fscore=0.8635\n640_1024_415_0.8_0.4 0.3/0.85 ALL :: Precision=0.8498 - Recall=0.8046 - Fscore=0.8266\n640_1024_420_0.7_0.6 0.3/0.85 ALL :: Precision=0.9138 - Recall=0.8333 - Fscore=0.8717\n640_1024_420_0.8_0.4 0.3/0.85 ALL :: Precision=0.8890 - Recall=0.7927 - Fscore=0.8381\n640_1024_425_0.7_0.6 0.3/0.85 ALL :: Precision=0.9073 - Recall=0.8247 - Fscore=0.8640\n640_1024_425_0.8_0.4 0.3/0.85 ALL :: Precision=0.8860 - Recall=0.7914 - Fscore=0.8360\n640_1024_430_0.7_0.6 0.3/0.85 ALL :: Precision=0.9150 - Recall=0.8278 - Fscore=0.8692\n640_1024_430_0.8_0.4 0.3/0.85 ALL :: Precision=0.8908 - Recall=0.7936 - Fscore=0.8394\n640_1024_435_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.8284 - Fscore=0.8651\n640_1024_435_0.8_0.4 0.3/0.85 ALL :: Precision=0.8501 - Recall=0.7658 - Fscore=0.8057\n640_1024_440_0.7_0.6 0.3/0.85 ALL :: Precision=0.9019 - Recall=0.8144 - Fscore=0.8559\n640_1024_440_0.8_0.4 0.3/0.85 ALL :: Precision=0.8591 - Recall=0.7656 - Fscore=0.8097\n640_1024_445_0.7_0.6 0.3/0.85 ALL :: Precision=0.9054 - Recall=0.8309 - Fscore=0.8665\n640_1024_445_0.8_0.4 0.3/0.85 ALL :: Precision=0.8720 - Recall=0.7837 - Fscore=0.8255\n640_1024_450_0.7_0.6 0.3/0.85 ALL :: Precision=0.9026 - Recall=0.8360 - Fscore=0.8680\n640_1024_450_0.8_0.4 0.3/0.85 ALL :: Precision=0.8756 - Recall=0.7980 - Fscore=0.8350\n640_1024_455_0.7_0.6 0.3/0.85 
ALL :: Precision=0.9142 - Recall=0.8088 - Fscore=0.8583\n640_1024_455_0.8_0.4 0.3/0.85 ALL :: Precision=0.8801 - Recall=0.7687 - Fscore=0.8206\n640_1024_460_0.7_0.6 0.3/0.85 ALL :: Precision=0.9003 - Recall=0.8290 - Fscore=0.8632\n640_1024_460_0.8_0.4 0.3/0.85 ALL :: Precision=0.8814 - Recall=0.7963 - Fscore=0.8367\n640_1024_465_0.7_0.6 0.3/0.85 ALL :: Precision=0.9083 - Recall=0.8284 - Fscore=0.8665\n640_1024_465_0.8_0.4 0.3/0.85 ALL :: Precision=0.8463 - Recall=0.7624 - Fscore=0.8022\n640_1024_470_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.8264 - Fscore=0.8640\n640_1024_470_0.8_0.4 0.3/0.85 ALL :: Precision=0.8668 - Recall=0.7785 - Fscore=0.8203\n640_1024_475_0.7_0.6 0.3/0.85 ALL :: Precision=0.9228 - Recall=0.8064 - Fscore=0.8607\n640_1024_475_0.8_0.4 0.3/0.85 ALL :: Precision=0.9016 - Recall=0.7737 - Fscore=0.8328\n640_1024_480_0.7_0.6 0.3/0.85 ALL :: Precision=0.9117 - Recall=0.8129 - Fscore=0.8594\n640_1024_480_0.8_0.4 0.3/0.85 ALL :: Precision=0.9108 - Recall=0.7998 - Fscore=0.8517\n640_1024_485_0.7_0.6 0.3/0.85 ALL :: Precision=0.8764 - Recall=0.8402 - Fscore=0.8579\n640_1024_485_0.8_0.4 0.3/0.85 ALL :: Precision=0.8354 - Recall=0.7776 - Fscore=0.8055\n640_1024_490_0.7_0.6 0.3/0.85 ALL :: Precision=0.9033 - Recall=0.8417 - Fscore=0.8714\n640_1024_490_0.8_0.4 0.3/0.85 ALL :: Precision=0.8577 - Recall=0.7890 - Fscore=0.8219\n640_1024_495_0.7_0.6 0.3/0.85 ALL :: Precision=0.9119 - Recall=0.8284 - Fscore=0.8682\n640_1024_495_0.8_0.4 0.3/0.85 ALL :: Precision=0.8496 - Recall=0.7620 - Fscore=0.8034\n640_1024_500_0.7_0.6 0.3/0.85 ALL :: Precision=0.9022 - Recall=0.8422 - Fscore=0.8712\n640_1024_500_0.8_0.4 0.3/0.85 ALL :: Precision=0.8714 - Recall=0.7998 - Fscore=0.8341\n640_1024_505_0.7_0.6 0.3/0.85 ALL :: Precision=0.9168 - Recall=0.8249 - Fscore=0.8684\n640_1024_505_0.8_0.4 0.3/0.85 ALL :: Precision=0.8742 - Recall=0.7757 - Fscore=0.8220\n640_1024_510_0.7_0.6 0.3/0.85 ALL :: Precision=0.8980 - Recall=0.8305 - Fscore=0.8629\n640_1024_510_0.8_0.4 
0.3/0.85 ALL :: Precision=0.8457 - Recall=0.7719 - Fscore=0.8071\n640_1024_515_0.7_0.6 0.3/0.85 ALL :: Precision=0.9138 - Recall=0.8304 - Fscore=0.8701\n640_1024_515_0.8_0.4 0.3/0.85 ALL :: Precision=0.8791 - Recall=0.7894 - Fscore=0.8318\n640_1024_520_0.7_0.6 0.3/0.85 ALL :: Precision=0.9174 - Recall=0.8219 - Fscore=0.8670\n640_1024_520_0.8_0.4 0.3/0.85 ALL :: Precision=0.8750 - Recall=0.7736 - Fscore=0.8212\n640_1024_525_0.7_0.6 0.3/0.85 ALL :: Precision=0.9173 - Recall=0.8246 - Fscore=0.8685\n640_1024_525_0.8_0.4 0.3/0.85 ALL :: Precision=0.8903 - Recall=0.7902 - Fscore=0.8373\n640_1024_530_0.7_0.6 0.3/0.85 ALL :: Precision=0.9003 - Recall=0.8358 - Fscore=0.8668\n640_1024_530_0.8_0.4 0.3/0.85 ALL :: Precision=0.8657 - Recall=0.7912 - Fscore=0.8268\n640_1024_535_0.7_0.6 0.3/0.85 ALL :: Precision=0.9101 - Recall=0.8303 - Fscore=0.8684\n640_1024_535_0.8_0.4 0.3/0.85 ALL :: Precision=0.8425 - Recall=0.7601 - Fscore=0.7991\n640_1024_540_0.7_0.6 0.3/0.85 ALL :: Precision=0.9070 - Recall=0.8270 - Fscore=0.8651\n640_1024_540_0.8_0.4 0.3/0.85 ALL :: Precision=0.8880 - Recall=0.7922 - Fscore=0.8374\n640_1024_545_0.7_0.6 0.3/0.85 ALL :: Precision=0.8862 - Recall=0.8309 - Fscore=0.8577\n640_1024_545_0.8_0.4 0.3/0.85 ALL :: Precision=0.8246 - Recall=0.7681 - Fscore=0.7953\n640_1024_550_0.7_0.6 0.3/0.85 ALL :: Precision=0.9061 - Recall=0.8276 - Fscore=0.8651\n640_1024_550_0.8_0.4 0.3/0.85 ALL :: Precision=0.8729 - Recall=0.7858 - Fscore=0.8271\n640_1024_555_0.7_0.6 0.3/0.85 ALL :: Precision=0.9012 - Recall=0.8410 - Fscore=0.8701\n640_1024_555_0.8_0.4 0.3/0.85 ALL :: Precision=0.8597 - Recall=0.7856 - Fscore=0.8210\n640_1024_560_0.7_0.6 0.3/0.85 ALL :: Precision=0.8993 - Recall=0.8335 - Fscore=0.8651\n640_1024_560_0.8_0.4 0.3/0.85 ALL :: Precision=0.8532 - Recall=0.7789 - Fscore=0.8144\n640_1024_565_0.7_0.6 0.3/0.85 ALL :: Precision=0.9188 - Recall=0.8422 - Fscore=0.8788\n640_1024_565_0.8_0.4 0.3/0.85 ALL :: Precision=0.8820 - Recall=0.7945 - 
Fscore=0.8359\n640_1024_570_0.7_0.6 0.3/0.85 ALL :: Precision=0.9077 - Recall=0.8271 - Fscore=0.8656\n640_1024_570_0.8_0.4 0.3/0.85 ALL :: Precision=0.8659 - Recall=0.7768 - Fscore=0.8189\n640_1024_575_0.7_0.6 0.3/0.85 ALL :: Precision=0.9108 - Recall=0.8214 - Fscore=0.8638\n640_1024_575_0.8_0.4 0.3/0.85 ALL :: Precision=0.8447 - Recall=0.7512 - Fscore=0.7952\n640_1024_580_0.7_0.6 0.3/0.85 ALL :: Precision=0.9136 - Recall=0.8044 - Fscore=0.8555\n640_1024_580_0.8_0.4 0.3/0.85 ALL :: Precision=0.8763 - Recall=0.7612 - Fscore=0.8147\n640_1024_585_0.7_0.6 0.3/0.85 ALL :: Precision=0.9097 - Recall=0.8334 - Fscore=0.8699\n640_1024_585_0.8_0.4 0.3/0.85 ALL :: Precision=0.8672 - Recall=0.7802 - Fscore=0.8214\n640_1024_590_0.7_0.6 0.3/0.85 ALL :: Precision=0.9086 - Recall=0.8271 - Fscore=0.8659\n640_1024_590_0.8_0.4 0.3/0.85 ALL :: Precision=0.8544 - Recall=0.7687 - Fscore=0.8093\n640_1024_595_0.7_0.6 0.3/0.85 ALL :: Precision=0.9148 - Recall=0.8355 - Fscore=0.8734\n640_1024_595_0.8_0.4 0.3/0.85 ALL :: Precision=0.8840 - Recall=0.7904 - Fscore=0.8346\n640_1024_600_0.7_0.6 0.3/0.85 ALL :: Precision=0.9086 - Recall=0.8419 - Fscore=0.8740\n640_1024_600_0.8_0.4 0.3/0.85 ALL :: Precision=0.8787 - Recall=0.7994 - Fscore=0.8372\n640_1024_605_0.7_0.6 0.3/0.85 ALL :: Precision=0.8997 - Recall=0.8450 - Fscore=0.8715\n640_1024_605_0.8_0.4 0.3/0.85 ALL :: Precision=0.8769 - Recall=0.8091 - Fscore=0.8416\n640_1024_610_0.7_0.6 0.3/0.85 ALL :: Precision=0.9126 - Recall=0.8148 - Fscore=0.8609\n640_1024_610_0.8_0.4 0.3/0.85 ALL :: Precision=0.8784 - Recall=0.7695 - Fscore=0.8204\n640_1024_615_0.7_0.6 0.3/0.85 ALL :: Precision=0.9046 - Recall=0.8327 - Fscore=0.8671\n640_1024_615_0.8_0.4 0.3/0.85 ALL :: Precision=0.8632 - Recall=0.7802 - Fscore=0.8196\n640_1024_620_0.7_0.6 0.3/0.85 ALL :: Precision=0.9005 - Recall=0.8495 - Fscore=0.8743\n640_1024_620_0.8_0.4 0.3/0.85 ALL :: Precision=0.8641 - Recall=0.8016 - Fscore=0.8317\n640_1024_625_0.7_0.6 0.3/0.85 ALL :: Precision=0.9048 - Recall=0.8413 
- Fscore=0.8719\n640_1024_625_0.8_0.4 0.3/0.85 ALL :: Precision=0.8578 - Recall=0.7865 - Fscore=0.8206\n640_1024_630_0.7_0.6 0.3/0.85 ALL :: Precision=0.9182 - Recall=0.8181 - Fscore=0.8653\n640_1024_630_0.8_0.4 0.3/0.85 ALL :: Precision=0.8621 - Recall=0.7583 - Fscore=0.8069\n640_1024_635_0.7_0.6 0.3/0.85 ALL :: Precision=0.9102 - Recall=0.8383 - Fscore=0.8728\n640_1024_635_0.8_0.4 0.3/0.85 ALL :: Precision=0.8753 - Recall=0.7914 - Fscore=0.8312\n640_1024_640_0.7_0.6 0.3/0.85 ALL :: Precision=0.9160 - Recall=0.8320 - Fscore=0.8720\n640_1024_640_0.8_0.4 0.3/0.85 ALL :: Precision=0.8804 - Recall=0.7851 - Fscore=0.8300\n640_1024_645_0.7_0.6 0.3/0.85 ALL :: Precision=0.9050 - Recall=0.8472 - Fscore=0.8751\n640_1024_645_0.8_0.4 0.3/0.85 ALL :: Precision=0.8799 - Recall=0.8103 - Fscore=0.8437\n640_1024_650_0.7_0.6 0.3/0.85 ALL :: Precision=0.9122 - Recall=0.8394 - Fscore=0.8743\n640_1024_650_0.8_0.4 0.3/0.85 ALL :: Precision=0.8630 - Recall=0.7830 - Fscore=0.8211\n640_1024_655_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.8440 - Fscore=0.8735\n640_1024_655_0.8_0.4 0.3/0.85 ALL :: Precision=0.8671 - Recall=0.7947 - Fscore=0.8294\n640_1024_660_0.7_0.6 0.3/0.85 ALL :: Precision=0.9090 - Recall=0.8440 - Fscore=0.8753\n640_1024_660_0.8_0.4 0.3/0.85 ALL :: Precision=0.8612 - Recall=0.7893 - Fscore=0.8237\n"
  },
  {
    "path": "output/README.md",
    "content": "\n## Code Update Description\n**Solving the difficulties of reproducing the performance in our paper**\n\n**Reproduction records of some followers** in [here](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/output/%E5%BE%AE%E4%BF%A1%E5%9B%BE%E7%89%87_20230626221956.jpg) \n\n## Code change\n\n\n* 1.change the code in loss.py (Great impact; When there is a lot of noise in the data annotation, the balanced cross entropy with OHEM will lead to unstable training, which is also the main reason for difficult reproduction)\n```\n # pixel class loss\ncls_loss = self.cls_ohem(fy_preds[:, 0, :, :], tr_mask.float(), train_mask)\n\n```\nto \n```\n# pixel class loss\n# cls_loss = self.cls_ohem(fy_preds[:, 0, :, :], tr_mask.float(), train_mask)\ncls_loss = self.BCE_loss(fy_preds[:, 0, :, :],  tr_mask.float())\ncls_loss = torch.mul(cls_loss, train_mask.float()).mean()       \n```\n\n\n* 2.change the code in loss.py (Minor impact; The parameter theta is missing in the old version)\n```\nif eps is None:\n  alpha = 1.0; beta = 3.0; gama = 0.05\nelse:\n  alpha = 1.0; beta = 3.0; gama = 0.1*torch.sigmoid(torch.tensor((eps - cfg.max_epoch)/cfg.max_epoch))\nloss = alpha*cls_loss + beta*dis_loss + norm_loss + angle_loss + gama*(point_loss + energy_loss)\nloss_dict = {\n            'total_loss': loss,\n            'cls_loss': alpha*cls_loss,\n            'distance loss': beta*dis_loss,\n            'dir_loss': norm_loss + angle_loss,\n            'norm_loss': norm_loss,\n            'angle_loss': angle_loss,\n            'point_loss': gama*point_loss,\n            'energy_loss': gama*energy_loss,}     \n```\nto \n```\nif eps is None:\n  alpha = 1.0; beta = 3.0; theta=0.5; gama = 0.05\nelse:\n  alpha = 1.0; beta = 3.0; theta=0.5; \n  gama = 0.1*torch.sigmoid(torch.tensor((eps - cfg.max_epoch)/cfg.max_epoch))\nloss = alpha*cls_loss + beta*dis_loss + theta*(norm_loss + angle_loss) + gama*(point_loss + energy_loss)\nloss_dict = {\n            'total_loss': loss,\n           
 'cls_loss': alpha*cls_loss,\n            'distance loss': beta*dis_loss,\n            'dir_loss': theta*(norm_loss + angle_loss),\n            'norm_loss': theta*norm_loss,\n            'angle_loss': theta*angle_loss,\n            'point_loss': gama*point_loss,\n            'energy_loss': gama*energy_loss,}      \n```\n\n* 3. Change the code in dataload.py (uncertain impact; these parameters are used to generate the boundary proposals during training and to add random noise)\n\n```\ndef __init__(self, transform, is_training=False):\n        super().__init__()\n        self.transform = transform\n        self.is_training = is_training\n        self.min_text_size = 4\n        self.jitter = 0.7\n        self.th_b = 0.4\n```\nto \n```\ndef __init__(self, transform, is_training=False):\n        super().__init__()\n        self.transform = transform\n        self.is_training = is_training\n        self.min_text_size = 4\n        self.jitter = 0.65\n        self.th_b = 0.35\n```\n\n\n\n* 4. Change the code in augmentation.py (uncertain impact; the range of angle augmentation is 60 instead of 30; add RandomDistortion)\n\n```\nclass Augmentation(object):\n    def __init__(self, size, mean, std):\n        self.size = size\n        self.mean = mean\n        self.std = std\n        self.augmentation = Compose([\n            RandomCropFlip(),\n            RandomResizeScale(size=self.size, ratio=(3. / 8, 5. 
/ 2)),\n            RandomResizedCrop(),\n            RotatePadding(up=30, colors=True),  # pretrain on Syn is \"up=15\", else is \"up=30\"\n            ResizeLimitSquare(size=self.size),\n            # # RandomBrightness(),\n            # # RandomContrast(),\n            RandomMirror(),\n            Normalize(mean=self.mean, std=self.std),\n        ])\n```\nto \n```\nclass Augmentation(object):\n    def __init__(self, size, mean, std):\n        self.size = size\n        self.mean = mean\n        self.std = std\n        self._transform_dict = {'brightness': 0.5, 'contrast': 0.5, 'sharpness': 0.8386, 'color': 0.5}\n        self.augmentation = Compose([\n            RandomCropFlip(),\n            RandomResizeScale(size=self.size, ratio=(3. / 8, 5. / 2)),\n            RandomResizedCrop(),\n            RotatePadding(up=60, colors=True),  # pretrain on Syn is \"up=15\", else is \"up=60\"\n            ResizeLimitSquare(size=self.size),\n            RandomMirror(),\n            RandomDistortion(self._transform_dict),\n            Normalize(mean=self.mean, std=self.std),\n        ])\n```\n\n## Reproduction Details\n\n* model: Res50-1s\n* training dataset: CTW1500 train set (1,000 images) or Total-Text train set (1,225 images), **without extra data**  \n* dataset version: old version ([CTW1500](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/dataset/ctw1500_text.py), [Total-Text](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/dataset/Total_Text.py))  \n* training scripts: [train_CTW1500_res50_1s.sh](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/scripts-train/train_CTW1500_res50_1s.sh), [train_Totaltext_res50_1s.sh](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/scripts-train/train_Totaltext_res50_1s.sh)  \n\n**Our reproduced performance (without extra data for pre-training) using the updated code:**   \n\n|  Datasets   |  pre-training |  Model        |recall | precision |F-measure| FPS |  \n|:----------:\t|:-----------:  |:-----------:\t|:-------:|:-----:|:-------:|:---:|  
\n| Total-Text \t|       -       | [Res50-1s](https://drive.google.com/file/d/1wZOXGrM74CbrGY_lo4yF907VpbwnLf9d/view?usp=sharing) \t    | 84.22 |91.88 |87.88 |13|  \n| CTW-1500  \t|       -       | [Res50-1s](https://drive.google.com/file/d/1wZOXGrM74CbrGY_lo4yF907VpbwnLf9d/view?usp=sharing)\t      | 82.69 |86.53 |84.57 |14|  \n\n\n## Test Log\n\n**1.** [CTW1500](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/output/Analysis/ctw1500_eval.txt)  \n```\n640_1024_100 0.35/0.825 ALL :: AP=0.6632 - Precision=0.8515 - Recall=0.7722 - Fscore=0.8099\n640_1024_105 0.35/0.825 ALL :: AP=0.6544 - Precision=0.8395 - Recall=0.7774 - Fscore=0.8072\n640_1024_110 0.35/0.825 ALL :: AP=0.6883 - Precision=0.8492 - Recall=0.8018 - Fscore=0.8248\n640_1024_115 0.35/0.825 ALL :: AP=0.6657 - Precision=0.8621 - Recall=0.7663 - Fscore=0.8114\n640_1024_120 0.35/0.825 ALL :: AP=0.6580 - Precision=0.8451 - Recall=0.7699 - Fscore=0.8057\n640_1024_125 0.35/0.825 ALL :: AP=0.6577 - Precision=0.8537 - Recall=0.7683 - Fscore=0.8087\n640_1024_130 0.35/0.825 ALL :: AP=0.6809 - Precision=0.8626 - Recall=0.7839 - Fscore=0.8214\n640_1024_135 0.35/0.825 ALL :: AP=0.6669 - Precision=0.8518 - Recall=0.7754 - Fscore=0.8118\n640_1024_140 0.35/0.825 ALL :: AP=0.6896 - Precision=0.8660 - Recall=0.7920 - Fscore=0.8274\n640_1024_145 0.35/0.825 ALL :: AP=0.6684 - Precision=0.8572 - Recall=0.7751 - Fscore=0.8141\n640_1024_150 0.35/0.825 ALL :: AP=0.6791 - Precision=0.8726 - Recall=0.7705 - Fscore=0.8184\n640_1024_155 0.35/0.825 ALL :: AP=0.6903 - Precision=0.8451 - Recall=0.8093 - Fscore=0.8268\n640_1024_160 0.35/0.825 ALL :: AP=0.6880 - Precision=0.8580 - Recall=0.7937 - Fscore=0.8246\n640_1024_165 0.35/0.825 ALL :: AP=0.6772 - Precision=0.8463 - Recall=0.7953 - Fscore=0.8200\n640_1024_170 0.35/0.825 ALL :: AP=0.6857 - Precision=0.8497 - Recall=0.8018 - Fscore=0.8251\n640_1024_175 0.35/0.825 ALL :: AP=0.6968 - Precision=0.8787 - Recall=0.7885 - Fscore=0.8311\n640_1024_180 0.35/0.825 ALL :: AP=0.6832 - 
Precision=0.8473 - Recall=0.7995 - Fscore=0.8227\n640_1024_185 0.35/0.825 ALL :: AP=0.6979 - Precision=0.8508 - Recall=0.8123 - Fscore=0.8311\n640_1024_190 0.35/0.825 ALL :: AP=0.6934 - Precision=0.8686 - Recall=0.7927 - Fscore=0.8289\n640_1024_195 0.35/0.825 ALL :: AP=0.6944 - Precision=0.8657 - Recall=0.7966 - Fscore=0.8297\n640_1024_200 0.35/0.825 ALL :: AP=0.6974 - Precision=0.8494 - Recall=0.8165 - Fscore=0.8326\n640_1024_205 0.35/0.825 ALL :: AP=0.6833 - Precision=0.8523 - Recall=0.7956 - Fscore=0.8230\n640_1024_210 0.35/0.825 ALL :: AP=0.6821 - Precision=0.8675 - Recall=0.7790 - Fscore=0.8209\n640_1024_215 0.35/0.825 ALL :: AP=0.7005 - Precision=0.8669 - Recall=0.7986 - Fscore=0.8314\n640_1024_220 0.35/0.825 ALL :: AP=0.6879 - Precision=0.8670 - Recall=0.7859 - Fscore=0.8244\n640_1024_225 0.35/0.825 ALL :: AP=0.6713 - Precision=0.8678 - Recall=0.7663 - Fscore=0.8139\n640_1024_230 0.35/0.825 ALL :: AP=0.6994 - Precision=0.8444 - Recall=0.8175 - Fscore=0.8307\n640_1024_235 0.35/0.825 ALL :: AP=0.7070 - Precision=0.8759 - Recall=0.8008 - Fscore=0.8367\n640_1024_240 0.35/0.825 ALL :: AP=0.6847 - Precision=0.8722 - Recall=0.7787 - Fscore=0.8228\n640_1024_245 0.35/0.825 ALL :: AP=0.6970 - Precision=0.8541 - Recall=0.8093 - Fscore=0.8311\n640_1024_250 0.35/0.825 ALL :: AP=0.7015 - Precision=0.8534 - Recall=0.8119 - Fscore=0.8321\n640_1024_255 0.35/0.825 ALL :: AP=0.6975 - Precision=0.8485 - Recall=0.8106 - Fscore=0.8291\n640_1024_260 0.35/0.825 ALL :: AP=0.6754 - Precision=0.8572 - Recall=0.7787 - Fscore=0.8161\n640_1024_265 0.35/0.825 ALL :: AP=0.6902 - Precision=0.8610 - Recall=0.7953 - Fscore=0.8268\n640_1024_270 0.35/0.825 ALL :: AP=0.6844 - Precision=0.8678 - Recall=0.7829 - Fscore=0.8232\n640_1024_275 0.35/0.825 ALL :: AP=0.7021 - Precision=0.8690 - Recall=0.8018 - Fscore=0.8340\n640_1024_280 0.35/0.825 ALL :: AP=0.6991 - Precision=0.8442 - Recall=0.8198 - Fscore=0.8318\n640_1024_285 0.35/0.825 ALL :: AP=0.7004 - Precision=0.8603 - Recall=0.8067 - 
Fscore=0.8326\n640_1024_290 0.35/0.825 ALL :: AP=0.6953 - Precision=0.8449 - Recall=0.8129 - Fscore=0.8286\n640_1024_295 0.35/0.825 ALL :: AP=0.7120 - Precision=0.8629 - Recall=0.8207 - Fscore=0.8413\n640_1024_300 0.35/0.825 ALL :: AP=0.7091 - Precision=0.8422 - Recall=0.8315 - Fscore=0.8368\n640_1024_305 0.35/0.825 ALL :: AP=0.7105 - Precision=0.8420 - Recall=0.8357 - Fscore=0.8389\n640_1024_310 0.35/0.825 ALL :: AP=0.7002 - Precision=0.8469 - Recall=0.8220 - Fscore=0.8343\n640_1024_315 0.35/0.825 ALL :: AP=0.6916 - Precision=0.8551 - Recall=0.8018 - Fscore=0.8276\n640_1024_320 0.35/0.825 ALL :: AP=0.7021 - Precision=0.8529 - Recall=0.8142 - Fscore=0.8331\n640_1024_325 0.35/0.825 ALL :: AP=0.7042 - Precision=0.8406 - Recall=0.8269 - Fscore=0.8337\n640_1024_330 0.35/0.825 ALL :: AP=0.7052 - Precision=0.8578 - Recall=0.8139 - Fscore=0.8353\n640_1024_335 0.35/0.825 ALL :: AP=0.7005 - Precision=0.8608 - Recall=0.8083 - Fscore=0.8338\n640_1024_340 0.35/0.825 ALL :: AP=0.6979 - Precision=0.8683 - Recall=0.7976 - Fscore=0.8315\n640_1024_345 0.35/0.825 ALL :: AP=0.6818 - Precision=0.8432 - Recall=0.7995 - Fscore=0.8208\n640_1024_350 0.35/0.825 ALL :: AP=0.6933 - Precision=0.8465 - Recall=0.8090 - Fscore=0.8273\n640_1024_355 0.35/0.825 ALL :: AP=0.6955 - Precision=0.8534 - Recall=0.8067 - Fscore=0.8294\n640_1024_360 0.35/0.825 ALL :: AP=0.6930 - Precision=0.8762 - Recall=0.7819 - Fscore=0.8264\n640_1024_365 0.35/0.825 ALL :: AP=0.6974 - Precision=0.8499 - Recall=0.8123 - Fscore=0.8307\n640_1024_370 0.35/0.825 ALL :: AP=0.6947 - Precision=0.8456 - Recall=0.8139 - Fscore=0.8294\n640_1024_375 0.35/0.825 ALL :: AP=0.6806 - Precision=0.8414 - Recall=0.7989 - Fscore=0.8196\n640_1024_380 0.35/0.825 ALL :: AP=0.6957 - Precision=0.8538 - Recall=0.8070 - Fscore=0.8298\n640_1024_385 0.35/0.825 ALL :: AP=0.6993 - Precision=0.8491 - Recall=0.8158 - Fscore=0.8321\n640_1024_390 0.35/0.825 ALL :: AP=0.6997 - Precision=0.8530 - Recall=0.8113 - Fscore=0.8316\n640_1024_395 0.35/0.825 ALL :: 
AP=0.6755 - Precision=0.8327 - Recall=0.8015 - Fscore=0.8168\n640_1024_400 0.35/0.825 ALL :: AP=0.7079 - Precision=0.8689 - Recall=0.8057 - Fscore=0.8361\n640_1024_405 0.35/0.825 ALL :: AP=0.6922 - Precision=0.8531 - Recall=0.8048 - Fscore=0.8282\n640_1024_410 0.35/0.825 ALL :: AP=0.7096 - Precision=0.8579 - Recall=0.8204 - Fscore=0.8387\n640_1024_415 0.35/0.825 ALL :: AP=0.7033 - Precision=0.8450 - Recall=0.8230 - Fscore=0.8339\n640_1024_420 0.35/0.825 ALL :: AP=0.6921 - Precision=0.8343 - Recall=0.8220 - Fscore=0.8281\n640_1024_425 0.35/0.825 ALL :: AP=0.6949 - Precision=0.8466 - Recall=0.8129 - Fscore=0.8294\n640_1024_430 0.35/0.825 ALL :: AP=0.6924 - Precision=0.8455 - Recall=0.8100 - Fscore=0.8274\n640_1024_435 0.35/0.825 ALL :: AP=0.6899 - Precision=0.8387 - Recall=0.8119 - Fscore=0.8251\n640_1024_440 0.35/0.825 ALL :: AP=0.7132 - Precision=0.8779 - Recall=0.8061 - Fscore=0.8404\n640_1024_445 0.35/0.825 ALL :: AP=0.6952 - Precision=0.8301 - Recall=0.8331 - Fscore=0.8316\n640_1024_450 0.35/0.825 ALL :: AP=0.7022 - Precision=0.8593 - Recall=0.8083 - Fscore=0.8331\n640_1024_455 0.35/0.825 ALL :: AP=0.6989 - Precision=0.8478 - Recall=0.8155 - Fscore=0.8314\n640_1024_460 0.35/0.825 ALL :: AP=0.7011 - Precision=0.8488 - Recall=0.8181 - Fscore=0.8332\n640_1024_465 0.35/0.825 ALL :: AP=0.7035 - Precision=0.8504 - Recall=0.8191 - Fscore=0.8345\n640_1024_470 0.35/0.825 ALL :: AP=0.6974 - Precision=0.8583 - Recall=0.8057 - Fscore=0.8312\n640_1024_475 0.35/0.825 ALL :: AP=0.7064 - Precision=0.8565 - Recall=0.8171 - Fscore=0.8364\n640_1024_480 0.35/0.825 ALL :: AP=0.6938 - Precision=0.8571 - Recall=0.8015 - Fscore=0.8284\n640_1024_485 0.35/0.825 ALL :: AP=0.6903 - Precision=0.8358 - Recall=0.8181 - Fscore=0.8269\n640_1024_490 0.35/0.825 ALL :: AP=0.7216 - Precision=0.8653 - Recall=0.8269 - Fscore=0.8457\n640_1024_495 0.35/0.825 ALL :: AP=0.6949 - Precision=0.8535 - Recall=0.8054 - Fscore=0.8288\n640_1024_500 0.35/0.825 ALL :: AP=0.7022 - Precision=0.8380 - Recall=0.8295 - 
Fscore=0.8337\n640_1024_505 0.35/0.825 ALL :: AP=0.7058 - Precision=0.8461 - Recall=0.8259 - Fscore=0.8359\n640_1024_510 0.35/0.825 ALL :: AP=0.7020 - Precision=0.8359 - Recall=0.8302 - Fscore=0.8330\n640_1024_515 0.35/0.825 ALL :: AP=0.7108 - Precision=0.8612 - Recall=0.8188 - Fscore=0.8394\n640_1024_520 0.35/0.825 ALL :: AP=0.6952 - Precision=0.8337 - Recall=0.8250 - Fscore=0.8293\n640_1024_525 0.35/0.825 ALL :: AP=0.6513 - Precision=0.7806 - Recall=0.8256 - Fscore=0.8025\n640_1024_530 0.35/0.825 ALL :: AP=0.7150 - Precision=0.8620 - Recall=0.8227 - Fscore=0.8419\n640_1024_535 0.35/0.825 ALL :: AP=0.7025 - Precision=0.8563 - Recall=0.8136 - Fscore=0.8344\n640_1024_540 0.35/0.825 ALL :: AP=0.6968 - Precision=0.8436 - Recall=0.8207 - Fscore=0.8320\n640_1024_545 0.35/0.825 ALL :: AP=0.7020 - Precision=0.8393 - Recall=0.8289 - Fscore=0.8340\n640_1024_550 0.35/0.825 ALL :: AP=0.7073 - Precision=0.8600 - Recall=0.8152 - Fscore=0.8370\n640_1024_555 0.35/0.825 ALL :: AP=0.7098 - Precision=0.8350 - Recall=0.8413 - Fscore=0.8381\n640_1024_560 0.35/0.825 ALL :: AP=0.6873 - Precision=0.8209 - Recall=0.8279 - Fscore=0.8244\n640_1024_565 0.35/0.825 ALL :: AP=0.7031 - Precision=0.8477 - Recall=0.8237 - Fscore=0.8355\n640_1024_570 0.35/0.825 ALL :: AP=0.7034 - Precision=0.8444 - Recall=0.8240 - Fscore=0.8340\n640_1024_575 0.35/0.825 ALL :: AP=0.6792 - Precision=0.8153 - Recall=0.8217 - Fscore=0.8185\n640_1024_580 0.35/0.825 ALL :: AP=0.6882 - Precision=0.8304 - Recall=0.8217 - Fscore=0.8260\n640_1024_585 0.35/0.825 ALL :: AP=0.6805 - Precision=0.8316 - Recall=0.8110 - Fscore=0.8211\n640_1024_590 0.35/0.825 ALL :: AP=0.7039 - Precision=0.8354 - Recall=0.8321 - Fscore=0.8338\n640_1024_595 0.35/0.825 ALL :: AP=0.7045 - Precision=0.8492 - Recall=0.8204 - Fscore=0.8345\n640_1024_600 0.35/0.825 ALL :: AP=0.6956 - Precision=0.8333 - Recall=0.8259 - Fscore=0.8296\n640_1024_605 0.35/0.825 ALL :: AP=0.7063 - Precision=0.8498 - Recall=0.8227 - Fscore=0.8360\n640_1024_610 0.35/0.825 ALL :: 
AP=0.6922 - Precision=0.8296 - Recall=0.8250 - Fscore=0.8273\n640_1024_615 0.35/0.825 ALL :: AP=0.6970 - Precision=0.8459 - Recall=0.8175 - Fscore=0.8314\n640_1024_620 0.35/0.825 ALL :: AP=0.6971 - Precision=0.8452 - Recall=0.8152 - Fscore=0.8299\n640_1024_625 0.35/0.825 ALL :: AP=0.7046 - Precision=0.8491 - Recall=0.8214 - Fscore=0.8350\n640_1024_630 0.35/0.825 ALL :: AP=0.6983 - Precision=0.8487 - Recall=0.8139 - Fscore=0.8309\n640_1024_635 0.35/0.825 ALL :: AP=0.6959 - Precision=0.8364 - Recall=0.8233 - Fscore=0.8298\n640_1024_640 0.35/0.825 ALL :: AP=0.6992 - Precision=0.8524 - Recall=0.8129 - Fscore=0.8322\n640_1024_645 0.35/0.825 ALL :: AP=0.6922 - Precision=0.8312 - Recall=0.8220 - Fscore=0.8266\n640_1024_650 0.35/0.825 ALL :: AP=0.6848 - Precision=0.8387 - Recall=0.8087 - Fscore=0.8234\n640_1024_655 0.35/0.825 ALL :: AP=0.6947 - Precision=0.8377 - Recall=0.8211 - Fscore=0.8293\n640_1024_660 0.35/0.825 ALL :: AP=0.6893 - Precision=0.8338 - Recall=0.8175 - Fscore=0.8255\n```\n\n**2.** [Total-Text](https://github.com/GXYM/TextBPN-Plus-Plus/blob/main/output/Analysis/totalText_eval.txt)  \n```\n640_1024_55_0.7_0.6 0.3/0.85 ALL :: Precision=0.8919 - Recall=0.5764 - Fscore=0.7003\n640_1024_55_0.8_0.4 0.3/0.85 ALL :: Precision=0.7440 - Recall=0.4755 - Fscore=0.5802\n640_1024_60_0.7_0.6 0.3/0.85 ALL :: Precision=0.8882 - Recall=0.7482 - Fscore=0.8122\n640_1024_60_0.8_0.4 0.3/0.85 ALL :: Precision=0.8648 - Recall=0.6999 - Fscore=0.7737\n640_1024_65_0.7_0.6 0.3/0.85 ALL :: Precision=0.8836 - Recall=0.7422 - Fscore=0.8067\n640_1024_65_0.8_0.4 0.3/0.85 ALL :: Precision=0.8580 - Recall=0.6985 - Fscore=0.7701\n640_1024_70_0.7_0.6 0.3/0.85 ALL :: Precision=0.8578 - Recall=0.7809 - Fscore=0.8175\n640_1024_70_0.8_0.4 0.3/0.85 ALL :: Precision=0.8510 - Recall=0.7440 - Fscore=0.7939\n640_1024_75_0.7_0.6 0.3/0.85 ALL :: Precision=0.9005 - Recall=0.7093 - Fscore=0.7935\n640_1024_75_0.8_0.4 0.3/0.85 ALL :: Precision=0.7720 - Recall=0.5996 - Fscore=0.6750\n640_1024_80_0.7_0.6 
0.3/0.85 ALL :: Precision=0.8779 - Recall=0.7393 - Fscore=0.8026\n640_1024_80_0.8_0.4 0.3/0.85 ALL :: Precision=0.9035 - Recall=0.7136 - Fscore=0.7974\n640_1024_85_0.7_0.6 0.3/0.85 ALL :: Precision=0.9066 - Recall=0.7356 - Fscore=0.8122\n640_1024_85_0.8_0.4 0.3/0.85 ALL :: Precision=0.9056 - Recall=0.6985 - Fscore=0.7887\n640_1024_90_0.7_0.6 0.3/0.85 ALL :: Precision=0.9147 - Recall=0.7342 - Fscore=0.8146\n640_1024_90_0.8_0.4 0.3/0.85 ALL :: Precision=0.8714 - Recall=0.6778 - Fscore=0.7625\n640_1024_95_0.7_0.6 0.3/0.85 ALL :: Precision=0.9105 - Recall=0.7541 - Fscore=0.8249\n640_1024_95_0.8_0.4 0.3/0.85 ALL :: Precision=0.8974 - Recall=0.7189 - Fscore=0.7983\n640_1024_100_0.7_0.6 0.3/0.85 ALL :: Precision=0.8707 - Recall=0.7873 - Fscore=0.8269\n640_1024_100_0.8_0.4 0.3/0.85 ALL :: Precision=0.8760 - Recall=0.7581 - Fscore=0.8128\n640_1024_105_0.7_0.6 0.3/0.85 ALL :: Precision=0.8976 - Recall=0.7715 - Fscore=0.8298\n640_1024_105_0.8_0.4 0.3/0.85 ALL :: Precision=0.8745 - Recall=0.7299 - Fscore=0.7956\n640_1024_110_0.7_0.6 0.3/0.85 ALL :: Precision=0.8657 - Recall=0.7544 - Fscore=0.8062\n640_1024_110_0.8_0.4 0.3/0.85 ALL :: Precision=0.8370 - Recall=0.7028 - Fscore=0.7641\n640_1024_115_0.7_0.6 0.3/0.85 ALL :: Precision=0.8907 - Recall=0.7803 - Fscore=0.8318\n640_1024_115_0.8_0.4 0.3/0.85 ALL :: Precision=0.8775 - Recall=0.7494 - Fscore=0.8084\n640_1024_120_0.7_0.6 0.3/0.85 ALL :: Precision=0.9044 - Recall=0.7874 - Fscore=0.8419\n640_1024_120_0.8_0.4 0.3/0.85 ALL :: Precision=0.8854 - Recall=0.7429 - Fscore=0.8079\n640_1024_125_0.7_0.6 0.3/0.85 ALL :: Precision=0.8920 - Recall=0.7837 - Fscore=0.8343\n640_1024_125_0.8_0.4 0.3/0.85 ALL :: Precision=0.8616 - Recall=0.7341 - Fscore=0.7927\n640_1024_130_0.7_0.6 0.3/0.85 ALL :: Precision=0.9021 - Recall=0.7599 - Fscore=0.8249\n640_1024_130_0.8_0.4 0.3/0.85 ALL :: Precision=0.8815 - Recall=0.7221 - Fscore=0.7939\n640_1024_135_0.7_0.6 0.3/0.85 ALL :: Precision=0.9095 - Recall=0.7417 - Fscore=0.8170\n640_1024_135_0.8_0.4 
0.3/0.85 ALL :: Precision=0.8822 - Recall=0.6976 - Fscore=0.7792\n640_1024_140_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.7681 - Fscore=0.8310\n640_1024_140_0.8_0.4 0.3/0.85 ALL :: Precision=0.8372 - Recall=0.7023 - Fscore=0.7638\n640_1024_145_0.7_0.6 0.3/0.85 ALL :: Precision=0.9037 - Recall=0.7492 - Fscore=0.8192\n640_1024_145_0.8_0.4 0.3/0.85 ALL :: Precision=0.8498 - Recall=0.6900 - Fscore=0.7616\n640_1024_150_0.7_0.6 0.3/0.85 ALL :: Precision=0.8820 - Recall=0.7822 - Fscore=0.8291\n640_1024_150_0.8_0.4 0.3/0.85 ALL :: Precision=0.7898 - Recall=0.6825 - Fscore=0.7322\n640_1024_155_0.7_0.6 0.3/0.85 ALL :: Precision=0.9289 - Recall=0.7593 - Fscore=0.8356\n640_1024_155_0.8_0.4 0.3/0.85 ALL :: Precision=0.8968 - Recall=0.7114 - Fscore=0.7934\n640_1024_160_0.7_0.6 0.3/0.85 ALL :: Precision=0.8956 - Recall=0.7883 - Fscore=0.8385\n640_1024_160_0.8_0.4 0.3/0.85 ALL :: Precision=0.8661 - Recall=0.7422 - Fscore=0.7994\n640_1024_165_0.7_0.6 0.3/0.85 ALL :: Precision=0.8919 - Recall=0.7734 - Fscore=0.8284\n640_1024_165_0.8_0.4 0.3/0.85 ALL :: Precision=0.9038 - Recall=0.7515 - Fscore=0.8206\n640_1024_170_0.7_0.6 0.3/0.85 ALL :: Precision=0.9014 - Recall=0.7495 - Fscore=0.8184\n640_1024_170_0.8_0.4 0.3/0.85 ALL :: Precision=0.8178 - Recall=0.6698 - Fscore=0.7364\n640_1024_175_0.7_0.6 0.3/0.85 ALL :: Precision=0.8883 - Recall=0.7802 - Fscore=0.8307\n640_1024_175_0.8_0.4 0.3/0.85 ALL :: Precision=0.8175 - Recall=0.7062 - Fscore=0.7578\n640_1024_180_0.7_0.6 0.3/0.85 ALL :: Precision=0.8944 - Recall=0.8206 - Fscore=0.8559\n640_1024_180_0.8_0.4 0.3/0.85 ALL :: Precision=0.8799 - Recall=0.7856 - Fscore=0.8300\n640_1024_185_0.7_0.6 0.3/0.85 ALL :: Precision=0.9049 - Recall=0.8038 - Fscore=0.8514\n640_1024_185_0.8_0.4 0.3/0.85 ALL :: Precision=0.8634 - Recall=0.7460 - Fscore=0.8004\n640_1024_190_0.7_0.6 0.3/0.85 ALL :: Precision=0.9025 - Recall=0.7893 - Fscore=0.8421\n640_1024_190_0.8_0.4 0.3/0.85 ALL :: Precision=0.8476 - Recall=0.7310 - 
Fscore=0.7850\n640_1024_195_0.7_0.6 0.3/0.85 ALL :: Precision=0.8633 - Recall=0.8346 - Fscore=0.8487\n640_1024_195_0.8_0.4 0.3/0.85 ALL :: Precision=0.8211 - Recall=0.7784 - Fscore=0.7992\n640_1024_200_0.7_0.6 0.3/0.85 ALL :: Precision=0.8903 - Recall=0.7979 - Fscore=0.8416\n640_1024_200_0.8_0.4 0.3/0.85 ALL :: Precision=0.8883 - Recall=0.7704 - Fscore=0.8251\n640_1024_205_0.7_0.6 0.3/0.85 ALL :: Precision=0.8790 - Recall=0.8166 - Fscore=0.8467\n640_1024_205_0.8_0.4 0.3/0.85 ALL :: Precision=0.8131 - Recall=0.7435 - Fscore=0.7767\n640_1024_210_0.7_0.6 0.3/0.85 ALL :: Precision=0.8907 - Recall=0.7886 - Fscore=0.8365\n640_1024_210_0.8_0.4 0.3/0.85 ALL :: Precision=0.8386 - Recall=0.7358 - Fscore=0.7839\n640_1024_215_0.7_0.6 0.3/0.85 ALL :: Precision=0.9042 - Recall=0.8074 - Fscore=0.8530\n640_1024_215_0.8_0.4 0.3/0.85 ALL :: Precision=0.8894 - Recall=0.7689 - Fscore=0.8247\n640_1024_220_0.7_0.6 0.3/0.85 ALL :: Precision=0.9024 - Recall=0.8124 - Fscore=0.8551\n640_1024_220_0.8_0.4 0.3/0.85 ALL :: Precision=0.8894 - Recall=0.7789 - Fscore=0.8305\n640_1024_225_0.7_0.6 0.3/0.85 ALL :: Precision=0.9243 - Recall=0.7753 - Fscore=0.8433\n640_1024_225_0.8_0.4 0.3/0.85 ALL :: Precision=0.8966 - Recall=0.7331 - Fscore=0.8067\n640_1024_230_0.7_0.6 0.3/0.85 ALL :: Precision=0.9038 - Recall=0.7887 - Fscore=0.8423\n640_1024_230_0.8_0.4 0.3/0.85 ALL :: Precision=0.8479 - Recall=0.7240 - Fscore=0.7811\n640_1024_235_0.7_0.6 0.3/0.85 ALL :: Precision=0.8893 - Recall=0.8155 - Fscore=0.8508\n640_1024_235_0.8_0.4 0.3/0.85 ALL :: Precision=0.8738 - Recall=0.7748 - Fscore=0.8213\n640_1024_240_0.7_0.6 0.3/0.85 ALL :: Precision=0.9134 - Recall=0.7917 - Fscore=0.8482\n640_1024_240_0.8_0.4 0.3/0.85 ALL :: Precision=0.8867 - Recall=0.7480 - Fscore=0.8115\n640_1024_245_0.7_0.6 0.3/0.85 ALL :: Precision=0.8786 - Recall=0.8240 - Fscore=0.8505\n640_1024_245_0.8_0.4 0.3/0.85 ALL :: Precision=0.8876 - Recall=0.8023 - Fscore=0.8428\n640_1024_250_0.7_0.6 0.3/0.85 ALL :: Precision=0.9029 - Recall=0.8116 
- Fscore=0.8548\n640_1024_250_0.8_0.4 0.3/0.85 ALL :: Precision=0.8781 - Recall=0.7730 - Fscore=0.8222\n640_1024_255_0.7_0.6 0.3/0.85 ALL :: Precision=0.8993 - Recall=0.8044 - Fscore=0.8492\n640_1024_255_0.8_0.4 0.3/0.85 ALL :: Precision=0.8817 - Recall=0.7730 - Fscore=0.8237\n640_1024_260_0.7_0.6 0.3/0.85 ALL :: Precision=0.9177 - Recall=0.7869 - Fscore=0.8473\n640_1024_260_0.8_0.4 0.3/0.85 ALL :: Precision=0.8965 - Recall=0.7519 - Fscore=0.8179\n640_1024_265_0.7_0.6 0.3/0.85 ALL :: Precision=0.9016 - Recall=0.7657 - Fscore=0.8281\n640_1024_265_0.8_0.4 0.3/0.85 ALL :: Precision=0.8101 - Recall=0.6797 - Fscore=0.7392\n640_1024_270_0.7_0.6 0.3/0.85 ALL :: Precision=0.9182 - Recall=0.7625 - Fscore=0.8332\n640_1024_270_0.8_0.4 0.3/0.85 ALL :: Precision=0.8800 - Recall=0.7181 - Fscore=0.7908\n640_1024_275_0.7_0.6 0.3/0.85 ALL :: Precision=0.8808 - Recall=0.8362 - Fscore=0.8579\n640_1024_275_0.8_0.4 0.3/0.85 ALL :: Precision=0.8450 - Recall=0.7865 - Fscore=0.8147\n640_1024_280_0.7_0.6 0.3/0.85 ALL :: Precision=0.8764 - Recall=0.8334 - Fscore=0.8544\n640_1024_280_0.8_0.4 0.3/0.85 ALL :: Precision=0.8273 - Recall=0.7779 - Fscore=0.8018\n640_1024_285_0.7_0.6 0.3/0.85 ALL :: Precision=0.9112 - Recall=0.7925 - Fscore=0.8477\n640_1024_285_0.8_0.4 0.3/0.85 ALL :: Precision=0.8581 - Recall=0.7381 - Fscore=0.7936\n640_1024_290_0.7_0.6 0.3/0.85 ALL :: Precision=0.9156 - Recall=0.8065 - Fscore=0.8576\n640_1024_290_0.8_0.4 0.3/0.85 ALL :: Precision=0.8930 - Recall=0.7752 - Fscore=0.8299\n640_1024_295_0.7_0.6 0.3/0.85 ALL :: Precision=0.8806 - Recall=0.8238 - Fscore=0.8512\n640_1024_295_0.8_0.4 0.3/0.85 ALL :: Precision=0.8351 - Recall=0.7685 - Fscore=0.8004\n640_1024_300_0.7_0.6 0.3/0.85 ALL :: Precision=0.9062 - Recall=0.8168 - Fscore=0.8592\n640_1024_300_0.8_0.4 0.3/0.85 ALL :: Precision=0.8715 - Recall=0.7657 - Fscore=0.8152\n640_1024_305_0.7_0.6 0.3/0.85 ALL :: Precision=0.8940 - Recall=0.8250 - Fscore=0.8581\n640_1024_305_0.8_0.4 0.3/0.85 ALL :: Precision=0.8665 - 
Recall=0.7877 - Fscore=0.8252\n640_1024_310_0.7_0.6 0.3/0.85 ALL :: Precision=0.9030 - Recall=0.8124 - Fscore=0.8553\n640_1024_310_0.8_0.4 0.3/0.85 ALL :: Precision=0.8219 - Recall=0.7302 - Fscore=0.7734\n640_1024_315_0.7_0.6 0.3/0.85 ALL :: Precision=0.8854 - Recall=0.8052 - Fscore=0.8434\n640_1024_315_0.8_0.4 0.3/0.85 ALL :: Precision=0.8243 - Recall=0.7396 - Fscore=0.7797\n640_1024_320_0.7_0.6 0.3/0.85 ALL :: Precision=0.9020 - Recall=0.8080 - Fscore=0.8524\n640_1024_320_0.8_0.4 0.3/0.85 ALL :: Precision=0.8380 - Recall=0.7368 - Fscore=0.7842\n640_1024_325_0.7_0.6 0.3/0.85 ALL :: Precision=0.9207 - Recall=0.7591 - Fscore=0.8321\n640_1024_325_0.8_0.4 0.3/0.85 ALL :: Precision=0.8780 - Recall=0.7137 - Fscore=0.7874\n640_1024_330_0.7_0.6 0.3/0.85 ALL :: Precision=0.9092 - Recall=0.8017 - Fscore=0.8521\n640_1024_330_0.8_0.4 0.3/0.85 ALL :: Precision=0.8613 - Recall=0.7506 - Fscore=0.8022\n640_1024_335_0.7_0.6 0.3/0.85 ALL :: Precision=0.9124 - Recall=0.7884 - Fscore=0.8459\n640_1024_335_0.8_0.4 0.3/0.85 ALL :: Precision=0.8746 - Recall=0.7389 - Fscore=0.8010\n640_1024_340_0.7_0.6 0.3/0.85 ALL :: Precision=0.8851 - Recall=0.8261 - Fscore=0.8546\n640_1024_340_0.8_0.4 0.3/0.85 ALL :: Precision=0.8666 - Recall=0.7851 - Fscore=0.8238\n640_1024_345_0.7_0.6 0.3/0.85 ALL :: Precision=0.8868 - Recall=0.8367 - Fscore=0.8610\n640_1024_345_0.8_0.4 0.3/0.85 ALL :: Precision=0.8489 - Recall=0.7883 - Fscore=0.8175\n640_1024_350_0.7_0.6 0.3/0.85 ALL :: Precision=0.8902 - Recall=0.8297 - Fscore=0.8589\n640_1024_350_0.8_0.4 0.3/0.85 ALL :: Precision=0.8778 - Recall=0.8043 - Fscore=0.8394\n640_1024_355_0.7_0.6 0.3/0.85 ALL :: Precision=0.9133 - Recall=0.8091 - Fscore=0.8580\n640_1024_355_0.8_0.4 0.3/0.85 ALL :: Precision=0.8882 - Recall=0.7691 - Fscore=0.8244\n640_1024_360_0.7_0.6 0.3/0.85 ALL :: Precision=0.9170 - Recall=0.8006 - Fscore=0.8549\n640_1024_360_0.8_0.4 0.3/0.85 ALL :: Precision=0.8782 - Recall=0.7532 - Fscore=0.8109\n640_1024_365_0.7_0.6 0.3/0.85 ALL :: Precision=0.9107 
- Recall=0.8159 - Fscore=0.8607\n640_1024_365_0.8_0.4 0.3/0.85 ALL :: Precision=0.8958 - Recall=0.7883 - Fscore=0.8386\n640_1024_370_0.7_0.6 0.3/0.85 ALL :: Precision=0.9239 - Recall=0.8065 - Fscore=0.8612\n640_1024_370_0.8_0.4 0.3/0.85 ALL :: Precision=0.8984 - Recall=0.7693 - Fscore=0.8289\n640_1024_375_0.7_0.6 0.3/0.85 ALL :: Precision=0.9133 - Recall=0.8207 - Fscore=0.8645\n640_1024_375_0.8_0.4 0.3/0.85 ALL :: Precision=0.8985 - Recall=0.7885 - Fscore=0.8399\n640_1024_380_0.7_0.6 0.3/0.85 ALL :: Precision=0.9233 - Recall=0.7982 - Fscore=0.8562\n640_1024_380_0.8_0.4 0.3/0.85 ALL :: Precision=0.8988 - Recall=0.7631 - Fscore=0.8254\n640_1024_385_0.7_0.6 0.3/0.85 ALL :: Precision=0.9085 - Recall=0.8270 - Fscore=0.8659\n640_1024_385_0.8_0.4 0.3/0.85 ALL :: Precision=0.8804 - Recall=0.7867 - Fscore=0.8309\n640_1024_390_0.7_0.6 0.3/0.85 ALL :: Precision=0.8920 - Recall=0.8430 - Fscore=0.8668\n640_1024_390_0.8_0.4 0.3/0.85 ALL :: Precision=0.8527 - Recall=0.7916 - Fscore=0.8210\n640_1024_395_0.7_0.6 0.3/0.85 ALL :: Precision=0.9081 - Recall=0.8270 - Fscore=0.8657\n640_1024_395_0.8_0.4 0.3/0.85 ALL :: Precision=0.8728 - Recall=0.7782 - Fscore=0.8228\n640_1024_400_0.7_0.6 0.3/0.85 ALL :: Precision=0.9075 - Recall=0.8230 - Fscore=0.8632\n640_1024_400_0.8_0.4 0.3/0.85 ALL :: Precision=0.8488 - Recall=0.7623 - Fscore=0.8032\n640_1024_405_0.7_0.6 0.3/0.85 ALL :: Precision=0.8951 - Recall=0.8366 - Fscore=0.8648\n640_1024_405_0.8_0.4 0.3/0.85 ALL :: Precision=0.8294 - Recall=0.7659 - Fscore=0.7964\n640_1024_410_0.7_0.6 0.3/0.85 ALL :: Precision=0.9087 - Recall=0.8168 - Fscore=0.8603\n640_1024_410_0.8_0.4 0.3/0.85 ALL :: Precision=0.8503 - Recall=0.7520 - Fscore=0.7981\n640_1024_415_0.7_0.6 0.3/0.85 ALL :: Precision=0.8821 - Recall=0.8457 - Fscore=0.8635\n640_1024_415_0.8_0.4 0.3/0.85 ALL :: Precision=0.8498 - Recall=0.8046 - Fscore=0.8266\n640_1024_420_0.7_0.6 0.3/0.85 ALL :: Precision=0.9138 - Recall=0.8333 - Fscore=0.8717\n640_1024_420_0.8_0.4 0.3/0.85 ALL :: 
Precision=0.8890 - Recall=0.7927 - Fscore=0.8381\n640_1024_425_0.7_0.6 0.3/0.85 ALL :: Precision=0.9073 - Recall=0.8247 - Fscore=0.8640\n640_1024_425_0.8_0.4 0.3/0.85 ALL :: Precision=0.8860 - Recall=0.7914 - Fscore=0.8360\n640_1024_430_0.7_0.6 0.3/0.85 ALL :: Precision=0.9150 - Recall=0.8278 - Fscore=0.8692\n640_1024_430_0.8_0.4 0.3/0.85 ALL :: Precision=0.8908 - Recall=0.7936 - Fscore=0.8394\n640_1024_435_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.8284 - Fscore=0.8651\n640_1024_435_0.8_0.4 0.3/0.85 ALL :: Precision=0.8501 - Recall=0.7658 - Fscore=0.8057\n640_1024_440_0.7_0.6 0.3/0.85 ALL :: Precision=0.9019 - Recall=0.8144 - Fscore=0.8559\n640_1024_440_0.8_0.4 0.3/0.85 ALL :: Precision=0.8591 - Recall=0.7656 - Fscore=0.8097\n640_1024_445_0.7_0.6 0.3/0.85 ALL :: Precision=0.9054 - Recall=0.8309 - Fscore=0.8665\n640_1024_445_0.8_0.4 0.3/0.85 ALL :: Precision=0.8720 - Recall=0.7837 - Fscore=0.8255\n640_1024_450_0.7_0.6 0.3/0.85 ALL :: Precision=0.9026 - Recall=0.8360 - Fscore=0.8680\n640_1024_450_0.8_0.4 0.3/0.85 ALL :: Precision=0.8756 - Recall=0.7980 - Fscore=0.8350\n640_1024_455_0.7_0.6 0.3/0.85 ALL :: Precision=0.9142 - Recall=0.8088 - Fscore=0.8583\n640_1024_455_0.8_0.4 0.3/0.85 ALL :: Precision=0.8801 - Recall=0.7687 - Fscore=0.8206\n640_1024_460_0.7_0.6 0.3/0.85 ALL :: Precision=0.9003 - Recall=0.8290 - Fscore=0.8632\n640_1024_460_0.8_0.4 0.3/0.85 ALL :: Precision=0.8814 - Recall=0.7963 - Fscore=0.8367\n640_1024_465_0.7_0.6 0.3/0.85 ALL :: Precision=0.9083 - Recall=0.8284 - Fscore=0.8665\n640_1024_465_0.8_0.4 0.3/0.85 ALL :: Precision=0.8463 - Recall=0.7624 - Fscore=0.8022\n640_1024_470_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.8264 - Fscore=0.8640\n640_1024_470_0.8_0.4 0.3/0.85 ALL :: Precision=0.8668 - Recall=0.7785 - Fscore=0.8203\n640_1024_475_0.7_0.6 0.3/0.85 ALL :: Precision=0.9228 - Recall=0.8064 - Fscore=0.8607\n640_1024_475_0.8_0.4 0.3/0.85 ALL :: Precision=0.9016 - Recall=0.7737 - Fscore=0.8328\n640_1024_480_0.7_0.6 0.3/0.85 ALL 
:: Precision=0.9117 - Recall=0.8129 - Fscore=0.8594\n640_1024_480_0.8_0.4 0.3/0.85 ALL :: Precision=0.9108 - Recall=0.7998 - Fscore=0.8517\n640_1024_485_0.7_0.6 0.3/0.85 ALL :: Precision=0.8764 - Recall=0.8402 - Fscore=0.8579\n640_1024_485_0.8_0.4 0.3/0.85 ALL :: Precision=0.8354 - Recall=0.7776 - Fscore=0.8055\n640_1024_490_0.7_0.6 0.3/0.85 ALL :: Precision=0.9033 - Recall=0.8417 - Fscore=0.8714\n640_1024_490_0.8_0.4 0.3/0.85 ALL :: Precision=0.8577 - Recall=0.7890 - Fscore=0.8219\n640_1024_495_0.7_0.6 0.3/0.85 ALL :: Precision=0.9119 - Recall=0.8284 - Fscore=0.8682\n640_1024_495_0.8_0.4 0.3/0.85 ALL :: Precision=0.8496 - Recall=0.7620 - Fscore=0.8034\n640_1024_500_0.7_0.6 0.3/0.85 ALL :: Precision=0.9022 - Recall=0.8422 - Fscore=0.8712\n640_1024_500_0.8_0.4 0.3/0.85 ALL :: Precision=0.8714 - Recall=0.7998 - Fscore=0.8341\n640_1024_505_0.7_0.6 0.3/0.85 ALL :: Precision=0.9168 - Recall=0.8249 - Fscore=0.8684\n640_1024_505_0.8_0.4 0.3/0.85 ALL :: Precision=0.8742 - Recall=0.7757 - Fscore=0.8220\n640_1024_510_0.7_0.6 0.3/0.85 ALL :: Precision=0.8980 - Recall=0.8305 - Fscore=0.8629\n640_1024_510_0.8_0.4 0.3/0.85 ALL :: Precision=0.8457 - Recall=0.7719 - Fscore=0.8071\n640_1024_515_0.7_0.6 0.3/0.85 ALL :: Precision=0.9138 - Recall=0.8304 - Fscore=0.8701\n640_1024_515_0.8_0.4 0.3/0.85 ALL :: Precision=0.8791 - Recall=0.7894 - Fscore=0.8318\n640_1024_520_0.7_0.6 0.3/0.85 ALL :: Precision=0.9174 - Recall=0.8219 - Fscore=0.8670\n640_1024_520_0.8_0.4 0.3/0.85 ALL :: Precision=0.8750 - Recall=0.7736 - Fscore=0.8212\n640_1024_525_0.7_0.6 0.3/0.85 ALL :: Precision=0.9173 - Recall=0.8246 - Fscore=0.8685\n640_1024_525_0.8_0.4 0.3/0.85 ALL :: Precision=0.8903 - Recall=0.7902 - Fscore=0.8373\n640_1024_530_0.7_0.6 0.3/0.85 ALL :: Precision=0.9003 - Recall=0.8358 - Fscore=0.8668\n640_1024_530_0.8_0.4 0.3/0.85 ALL :: Precision=0.8657 - Recall=0.7912 - Fscore=0.8268\n640_1024_535_0.7_0.6 0.3/0.85 ALL :: Precision=0.9101 - Recall=0.8303 - Fscore=0.8684\n640_1024_535_0.8_0.4 0.3/0.85 
ALL :: Precision=0.8425 - Recall=0.7601 - Fscore=0.7991\n640_1024_540_0.7_0.6 0.3/0.85 ALL :: Precision=0.9070 - Recall=0.8270 - Fscore=0.8651\n640_1024_540_0.8_0.4 0.3/0.85 ALL :: Precision=0.8880 - Recall=0.7922 - Fscore=0.8374\n640_1024_545_0.7_0.6 0.3/0.85 ALL :: Precision=0.8862 - Recall=0.8309 - Fscore=0.8577\n640_1024_545_0.8_0.4 0.3/0.85 ALL :: Precision=0.8246 - Recall=0.7681 - Fscore=0.7953\n640_1024_550_0.7_0.6 0.3/0.85 ALL :: Precision=0.9061 - Recall=0.8276 - Fscore=0.8651\n640_1024_550_0.8_0.4 0.3/0.85 ALL :: Precision=0.8729 - Recall=0.7858 - Fscore=0.8271\n640_1024_555_0.7_0.6 0.3/0.85 ALL :: Precision=0.9012 - Recall=0.8410 - Fscore=0.8701\n640_1024_555_0.8_0.4 0.3/0.85 ALL :: Precision=0.8597 - Recall=0.7856 - Fscore=0.8210\n640_1024_560_0.7_0.6 0.3/0.85 ALL :: Precision=0.8993 - Recall=0.8335 - Fscore=0.8651\n640_1024_560_0.8_0.4 0.3/0.85 ALL :: Precision=0.8532 - Recall=0.7789 - Fscore=0.8144\n640_1024_565_0.7_0.6 0.3/0.85 ALL :: Precision=0.9188 - Recall=0.8422 - Fscore=0.8788\n640_1024_565_0.8_0.4 0.3/0.85 ALL :: Precision=0.8820 - Recall=0.7945 - Fscore=0.8359\n640_1024_570_0.7_0.6 0.3/0.85 ALL :: Precision=0.9077 - Recall=0.8271 - Fscore=0.8656\n640_1024_570_0.8_0.4 0.3/0.85 ALL :: Precision=0.8659 - Recall=0.7768 - Fscore=0.8189\n640_1024_575_0.7_0.6 0.3/0.85 ALL :: Precision=0.9108 - Recall=0.8214 - Fscore=0.8638\n640_1024_575_0.8_0.4 0.3/0.85 ALL :: Precision=0.8447 - Recall=0.7512 - Fscore=0.7952\n640_1024_580_0.7_0.6 0.3/0.85 ALL :: Precision=0.9136 - Recall=0.8044 - Fscore=0.8555\n640_1024_580_0.8_0.4 0.3/0.85 ALL :: Precision=0.8763 - Recall=0.7612 - Fscore=0.8147\n640_1024_585_0.7_0.6 0.3/0.85 ALL :: Precision=0.9097 - Recall=0.8334 - Fscore=0.8699\n640_1024_585_0.8_0.4 0.3/0.85 ALL :: Precision=0.8672 - Recall=0.7802 - Fscore=0.8214\n640_1024_590_0.7_0.6 0.3/0.85 ALL :: Precision=0.9086 - Recall=0.8271 - Fscore=0.8659\n640_1024_590_0.8_0.4 0.3/0.85 ALL :: Precision=0.8544 - Recall=0.7687 - Fscore=0.8093\n640_1024_595_0.7_0.6 
0.3/0.85 ALL :: Precision=0.9148 - Recall=0.8355 - Fscore=0.8734\n640_1024_595_0.8_0.4 0.3/0.85 ALL :: Precision=0.8840 - Recall=0.7904 - Fscore=0.8346\n640_1024_600_0.7_0.6 0.3/0.85 ALL :: Precision=0.9086 - Recall=0.8419 - Fscore=0.8740\n640_1024_600_0.8_0.4 0.3/0.85 ALL :: Precision=0.8787 - Recall=0.7994 - Fscore=0.8372\n640_1024_605_0.7_0.6 0.3/0.85 ALL :: Precision=0.8997 - Recall=0.8450 - Fscore=0.8715\n640_1024_605_0.8_0.4 0.3/0.85 ALL :: Precision=0.8769 - Recall=0.8091 - Fscore=0.8416\n640_1024_610_0.7_0.6 0.3/0.85 ALL :: Precision=0.9126 - Recall=0.8148 - Fscore=0.8609\n640_1024_610_0.8_0.4 0.3/0.85 ALL :: Precision=0.8784 - Recall=0.7695 - Fscore=0.8204\n640_1024_615_0.7_0.6 0.3/0.85 ALL :: Precision=0.9046 - Recall=0.8327 - Fscore=0.8671\n640_1024_615_0.8_0.4 0.3/0.85 ALL :: Precision=0.8632 - Recall=0.7802 - Fscore=0.8196\n640_1024_620_0.7_0.6 0.3/0.85 ALL :: Precision=0.9005 - Recall=0.8495 - Fscore=0.8743\n640_1024_620_0.8_0.4 0.3/0.85 ALL :: Precision=0.8641 - Recall=0.8016 - Fscore=0.8317\n640_1024_625_0.7_0.6 0.3/0.85 ALL :: Precision=0.9048 - Recall=0.8413 - Fscore=0.8719\n640_1024_625_0.8_0.4 0.3/0.85 ALL :: Precision=0.8578 - Recall=0.7865 - Fscore=0.8206\n640_1024_630_0.7_0.6 0.3/0.85 ALL :: Precision=0.9182 - Recall=0.8181 - Fscore=0.8653\n640_1024_630_0.8_0.4 0.3/0.85 ALL :: Precision=0.8621 - Recall=0.7583 - Fscore=0.8069\n640_1024_635_0.7_0.6 0.3/0.85 ALL :: Precision=0.9102 - Recall=0.8383 - Fscore=0.8728\n640_1024_635_0.8_0.4 0.3/0.85 ALL :: Precision=0.8753 - Recall=0.7914 - Fscore=0.8312\n640_1024_640_0.7_0.6 0.3/0.85 ALL :: Precision=0.9160 - Recall=0.8320 - Fscore=0.8720\n640_1024_640_0.8_0.4 0.3/0.85 ALL :: Precision=0.8804 - Recall=0.7851 - Fscore=0.8300\n640_1024_645_0.7_0.6 0.3/0.85 ALL :: Precision=0.9050 - Recall=0.8472 - Fscore=0.8751\n640_1024_645_0.8_0.4 0.3/0.85 ALL :: Precision=0.8799 - Recall=0.8103 - Fscore=0.8437\n640_1024_650_0.7_0.6 0.3/0.85 ALL :: Precision=0.9122 - Recall=0.8394 - 
Fscore=0.8743\n640_1024_650_0.8_0.4 0.3/0.85 ALL :: Precision=0.8630 - Recall=0.7830 - Fscore=0.8211\n640_1024_655_0.7_0.6 0.3/0.85 ALL :: Precision=0.9052 - Recall=0.8440 - Fscore=0.8735\n640_1024_655_0.8_0.4 0.3/0.85 ALL :: Precision=0.8671 - Recall=0.7947 - Fscore=0.8294\n640_1024_660_0.7_0.6 0.3/0.85 ALL :: Precision=0.9090 - Recall=0.8440 - Fscore=0.8753\n640_1024_660_0.8_0.4 0.3/0.85 ALL :: Precision=0.8612 - Recall=0.7893 - Fscore=0.8237\n```\n\n## Note: The quantitative results are saved in the `output` directory.\n"
  },
  {
    "path": "requirements.txt",
    "content": "opencv-python\npillow-simd\nscipy\neasydict\nmatplotlib\nshapely\nlxml\ntorchvision\n"
  },
  {
    "path": "scripts-eval/Eval_ArT.sh",
    "content": "#!/bin/bash\ncd ../\n##################### eval for ArT with ResNet50 1s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name ArT --checkepoch 605 --test_size 960 2880 --dis_threshold 0.4 --cls_threshold 0.4 --gpu 0;\n\n\n##################### eval for ArT with ResNet50-DCN 1s ###################################\nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net deformable_resnet50 --scale 1 --exp_name ArT --checkepoch 480 --test_size 960 2880 --dis_threshold 0.4 --cls_threshold 0.8 --gpu 0;\n\n\n##################### batch eval for ArT ###################################\n#for ((i=660; i>=300; i=i-5));\n#do \n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name ArT --net deformable_resnet50 --checkepoch $i --test_size 960 2880 --dis_threshold 0.45 --cls_threshold 0.8 --gpu 0;\n#done\n \n"
  },
  {
    "path": "scripts-eval/Eval_CTW1500.sh",
    "content": "#!/bin/bash\ncd ../\n\n##################### eval for Ctw1500 with ResNet18 4s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name Ctw1500 --checkepoch 105 --test_size 640 1024 --dis_threshold 0.35 --cls_threshold 0.85 --gpu 0;\n\n##################### eval for Ctw1500 with ResNet50 1s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name Ctw1500 --checkepoch 155 --test_size 640 1024 --dis_threshold 0.375 --cls_threshold 0.8 --gpu 0;\n\n\n##################### eval for Ctw1500 with ResNet50-DCN 1s ###################################\nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net deformable_resnet50 --scale 1 --exp_name Ctw1500 --checkepoch 565 --test_size 640 1024 --dis_threshold 0.3 --cls_threshold 0.925 --gpu 0;\n\n\n##################### test speed for Ctw1500 ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN_speed.py --net resnet18 --scale 4 --exp_name Ctw1500 --checkepoch 570 --test_size 640 1024 --dis_threshold 0.25 --cls_threshold 0.85 --gpu 0;\n\n\n##################### batch eval for Ctw1500 ###################################\n#for ((i=55; i<=660; i=i+5))\n# do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name Ctw1500 --checkepoch $i --test_size 640 1024 --dis_threshold 0.25 --cls_threshold 0.85 --gpu 1;\n#done\n \n"
  },
  {
    "path": "scripts-eval/Eval_MLT2017.sh",
"content": "#!/bin/bash\ncd ../\n\n##################### eval for MLT2017 with ResNet50 1s ###################################\nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name MLT2017 --checkepoch 600 --test_size 960 2048 --dis_threshold 0.5 --cls_threshold 0.75 --gpu 1;\n\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name MLT2017 --checkepoch 600 --test_size 960 2048 --dis_threshold 0.5 --cls_threshold 0.85 --gpu 1;\n\n\n\n##################### eval for MLT2017 with ResNet50-DCN 1s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net deformable_resnet50 --scale 1 --exp_name MLT2017 --checkepoch 545 --test_size 960 2048 --dis_threshold 0.5 --cls_threshold 0.85 --gpu 0;\n\n\n\n##################### batch eval for MLT2017 ###################################\n#for ((i=100; i<=660; i=i+5))\n#do \n# CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name MLT2017 --net deformable_resnet50  --scale 1 --checkepoch $i --test_size 960 2048 --dis_threshold 0.5 --cls_threshold 0.8 --gpu 0;\n#done\n"
  },
  {
    "path": "scripts-eval/Eval_TD500.sh",
    "content": "#!/bin/bash\ncd ../\n\n##################### eval for TD500 with ResNet18 4s ###################################\nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch 1135 --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 0; #--viz\n\n##################### eval for TD500 with ResNet50 1s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name TD500 --checkepoch 1070 --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.875 --gpu 0;\n\n\n##################### eval for TD500 with ResNet50-DCN 1s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net deformable_resnet50 --scale 1 --exp_name TD500 --checkepoch 355 --test_size 640 960 --dis_threshold 0.3 --cls_threshold 0.95 --gpu 0;\n\n\n##################### test speed for TD500 ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN_speed.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch 1135 --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 1;\n\n\n##################### batch eval for TD500 ###################################\n#for ((i=55; i<=1200; i=i+5))\n# do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch $i --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 1;\n#done\n \n"
  },
  {
    "path": "scripts-eval/Eval_Totaltext.sh",
    "content": "#!/bin/bash\ncd ../\n\n##################### eval for Total-Text with ResNet18 4s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name Totaltext --checkepoch 570 --test_size 640 1024 --dis_threshold 0.25 --cls_threshold 0.85 --gpu 0;\n\n##################### eval for Total-Text with ResNet50 1s ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name Totaltext --checkepoch 285 --test_size 640 1024 --dis_threshold 0.325 --cls_threshold 0.85 --gpu 0;\n\n\n##################### eval for Total-Text with ResNet50-DCN 1s ###################################\nCUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net deformable_resnet50 --scale 1 --exp_name Totaltext --checkepoch 480 --test_size 640 1024 --dis_threshold 0.325 --cls_threshold 0.9 --gpu 0;\n\n\n##################### test speed for Total-Text ###################################\n#CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN_speed.py --net resnet18 --scale 4 --exp_name Totaltext --checkepoch 570 --test_size 640 1024 --dis_threshold 0.25 --cls_threshold 0.85 --gpu 0;\n\n\n##################### batch eval for Total-Text ###################################\n#for ((i=55; i<=660; i=i+5))\n# do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name Totaltext --checkepoch $i --test_size 640 1024 --dis_threshold 0.25 --cls_threshold 0.85 --gpu 1;\n#done\n \n"
  },
  {
    "path": "scripts-train/train_ALL_res50_dcn_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name ALL --net deformable_resnet50 --scale 1 --max_epoch 660 --batch_size 14 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/ArT-DCN/TextBPN_deformable_resnet50_600.pth \n\n\n#--resume model/Synthtext/TextBPN_deformable_resnet50_0.pth --viz --viz_freq 500\n\n"
  },
  {
    "path": "scripts-train/train_Art_res50_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python train_textBPN.py --exp_name ArT --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/MLT/TextBPN_resnet50_300.pth --viz --viz_freq 300\n\n\n#--viz --viz_freq 80\n#--start_epoch 300\n#--load_memory True\n"
  },
  {
    "path": "scripts-train/train_Art_res50_dcn_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name ArT --net deformable_resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/MLT/TextBPN_deformable_resnet50_300.pth --viz --viz_freq 300\n\n\n#--viz --viz_freq 80\n#--start_epoch 300\n#--load_memory True\n"
  },
  {
    "path": "scripts-train/train_CTW1500_res18_4s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Ctw1500 --net resnet18 --scale 4 --max_epoch 660 --batch_size 48 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/MLT/TextBPN_resnet18_300.pth --load_memory True\n\n\n#--viz --viz_freq 80\n#--start_epoch 300\n\nfor ((i=55; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name Ctw1500 --checkepoch $i --test_size 640 1024 --dis_threshold 0.35 --cls_threshold 0.875 --gpu 0;\ndone\n\n"
  },
  {
    "path": "scripts-train/train_CTW1500_res50_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Ctw1500 --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 28 --viz --viz_freq 80\n\n#--resume model/Ctw1500/TextBPN_resnet50_390.pth --start_epoch 665\n#--resume model/Synthtext/TextBPN_resnet50_0.pth \n#--viz --viz_freq 80\n#--start_epoch 300\n\n###### test eval ############\nfor ((i=100; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name Ctw1500 --checkepoch $i --test_size 640 1024 --dis_threshold 0.35 --cls_threshold 0.825 --gpu 0;\ndone\n\n\n\n"
  },
  {
    "path": "scripts-train/train_CTW1500_res50_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Ctw1500 --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/MLT/TextBPN_resnet50_300.pth --load_memory True \n\n\n#--viz --viz_freq 80\n#--start_epoch 300\n\n#for ((i=55; i<=660; i=i+5))\n# do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name Ctw1500 --checkepoch $i --test_size 640 1024 --dis_threshold 0.375 --cls_threshold 0.8 --gpu 0;\n#done\n\n"
  },
  {
    "path": "scripts-train/train_CTW1500_res50_dcn_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Ctw1500 --net deformable_resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/MLT/TextBPN_deformable_resnet50_300.pth --viz --viz_freq 50\n\n"
  },
  {
    "path": "scripts-train/train_MLT2017_res18_4s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name MLT2017 --net resnet18 --scale 4 --max_epoch 660 --batch_size 48 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30\n\n#--start_epoch 276\n\n\n# --resume model/MLT2017/TextBPN_resnet50_200.pth --viz --viz_freq 500\n"
  },
  {
    "path": "scripts-train/train_MLT2017_res50_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name MLT2017 --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30 \n\n#--resume model/MLT2017/TextBPN_resnet50_300.pth --viz --viz_freq 500\n\n\n"
  },
  {
    "path": "scripts-train/train_MLT2017_res50_dcn_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name MLT2017 --net deformable_resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/Syn/TextBPN_deformable_resnet50_0.pth\n\n\n#--resume model/Synthtext/TextBPN_deformable_resnet50_0.pth --viz --viz_freq 500\n\n"
  },
  {
    "path": "scripts-train/train_MLT2019_res50_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name MLT2019 --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30\n\n#--viz --viz_freq 400\n"
  },
  {
    "path": "scripts-train/train_SynText_res18_4s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Synthtext --net resnet18 --scale 4 --max_epoch 1 --batch_size 48 --lr 0.001 --gpu 0 --input_size 640 --save_freq 1 --num_workers 30 \n#--viz --viz_freq 10000\n#--resume model/Synthtext/TextBPN_resnet50_0.pth --start_epoch 1\n"
  },
  {
    "path": "scripts-train/train_SynText_res50_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Synthtext --net resnet50 --scale 1 --max_epoch 1 --batch_size 12 --lr 0.001 --gpu 0 --input_size 640 --save_freq 1 --num_workers 30 \n\n#--resume model/Synthtext/TextBPN_resnet50_0.pth --start_epoch 1 --viz --viz_freq 10000\n"
  },
  {
    "path": "scripts-train/train_SynText_res50_dcn_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Synthtext --net deformable_resnet50 --scale 1 --max_epoch 1 --batch_size 14 --lr 0.0001 --gpu 1 --input_size 640 --save_freq 1 --num_workers 30 --viz --viz_freq 1000\n\n"
  },
  {
    "path": "scripts-train/train_TD500_res18_4s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name TD500 --net resnet18 --scale 4 --max_epoch 1200 --batch_size 48 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/MLT/TextBPN_resnet18_300.pth --load_memory True\n\n\n#--viz --viz_freq 80\n#--start_epoch 300\n\nfor ((i=55; i<=1200; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name TD500 --checkepoch $i --test_size 640 960 --dis_threshold 0.35 --cls_threshold 0.9 --gpu 0;\ndone\n\n"
  },
  {
    "path": "scripts-train/train_TD500_res50_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name TD500 --net resnet50 --scale 1 --max_epoch 1200 --batch_size 12 --gpu 1 --input_size 640 --optim Adam --lr 0.001 --num_workers 30 --load_memory True\n\n#--resume model/MLT2017/TextBPN_deformable_resnet50_300.pth\n\n"
  },
  {
    "path": "scripts-train/train_TD500_res50_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name TD500 --net resnet50 --scale 1 --max_epoch 1200 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --load_memory True --resume model/pretrain/MLT/TextBPN_resnet50_300.pth\n"
  },
  {
    "path": "scripts-train/train_TD500_res50_dcn_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name TD500 --net deformable_resnet50 --scale 1 --max_epoch 1200 --batch_size 12 --gpu 1 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/MLT2017/TextBPN_deformable_resnet50_300.pth\n\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res18_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet18 --max_epoch 660 --batch_size 24 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30 \n\n#--resume model/Synthtext/TextBPN_resnet50_0.pth \n#--viz --viz_freq 80\n#--start_epoch 300\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res18_2s.sh",
"content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet18 --scale 2 --max_epoch 660 --batch_size 32 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30 \n#--resume model/MLT2017/TextBPN_resnet50_100.pth  \n#--resume model/Synthtext/TextBPN_resnet50_0.pth \n#--viz --viz_freq 80\n#--start_epoch 300\n\n###### test eval ############\n#for ((i=100; i<=660; i=i+5))\n# do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name Totaltext --net resnet18 --scale 2 --checkepoch $i --test_size 640 1024 --dis_threshold 0.3 --cls_threshold 0.85 --gpu 0;\n#done\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res18_4s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet18 --scale 4 --max_epoch 660 --batch_size 48 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30 \n#--resume model/MLT2017/TextBPN_resnet50_100.pth  \n#--resume model/Synthtext/TextBPN_resnet50_0.pth \n#--viz --viz_freq 80\n#--start_epoch 300\n\nfor ((i=205; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name Totaltext --checkepoch $i --test_size 640 1024 --dis_threshold 0.25 --cls_threshold 0.9 --gpu 0;\ndone\n\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res18_4s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet18 --scale 4 --max_epoch 660 --batch_size 48 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --load_memory True --resume model/pretrain/MLT/TextBPN_resnet18_300.pth\n\n#--viz --viz_freq 80\n#--start_epoch 300\n\nfor ((i=55; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet18 --scale 4 --exp_name Totaltext --checkepoch $i --test_size 640 1024 --dis_threshold 0.25 --cls_threshold 0.85 --gpu 0;\ndone\n\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res50_1s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30 --viz --viz_freq 80\n#--resume model/MLT2017/TextBPN_resnet50_100.pth \n#--start_epoch 300\n#--viz \n\n###### test eval ############\nfor ((i=100; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --exp_name Totaltext --checkepoch $i --test_size 640 1024 --dis_threshold 0.3 --cls_threshold 0.85 --gpu 0;\n done\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res50_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --load_memory True --resume model/pretrain/MLT/TextBPN_resnet50_300.pth\n\n\n#--viz --viz_freq 80\n#--start_epoch 300\n\nfor ((i=100; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 1 --exp_name Totaltext --checkepoch $i --test_size 640 1024 --dis_threshold 0.3 --cls_threshold 0.85 --gpu 0;\ndone\n\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res50_4s.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python3 train_textBPN.py --exp_name Totaltext --net resnet50 --scale 4 --max_epoch 660 --batch_size 24 --gpu 0 --input_size 640 --optim Adam --lr 0.001 --num_workers 30 --load_memory True\n#--resume model/MLT2017/TextBPN_resnet50_100.pth  \n#--viz --viz_freq 80\n#--start_epoch 300\n\nfor ((i=205; i<=660; i=i+5))\n do CUDA_LAUNCH_BLOCKING=1 python3 eval_textBPN.py --net resnet50 --scale 4 --exp_name Totaltext --checkepoch $i --test_size 640 1024 --dis_threshold 0.3 --cls_threshold 0.9 --gpu 0;\ndone\n\n"
  },
  {
    "path": "scripts-train/train_Totaltext_res50_dcn_1s_fine_mlt.sh",
    "content": "#!/bin/bash\ncd ../\nCUDA_LAUNCH_BLOCKING=1 python train_textBPN.py --exp_name Totaltext --net deformable_resnet50 --scale 1 --max_epoch 660 --batch_size 12 --gpu 0 --input_size 640 --optim Adam --lr 0.0001 --num_workers 30 --resume model/pretrain/MLT/TextBPN_deformable_resnet50_300.pth --viz --viz_freq 50\n\n\n"
  },
  {
    "path": "train_textBPN.py",
"content": "import os\nimport gc\nimport time\nimport torch\nimport numpy as np\nimport torch.nn as nn\nimport torch.backends.cudnn as cudnn\nimport torch.utils.data as data\nfrom torch.optim import lr_scheduler\nfrom torch.utils.data import ConcatDataset\n\nfrom dataset import SynthText, TotalText, Ctw1500Text, Icdar15Text, LsvtTextJson,\\\n    Mlt2017Text, TD500Text, ArtTextJson, Mlt2019Text, Ctw1500Text_New, TotalText_New, ArtText\nfrom network.loss import TextLoss\nfrom network.textnet import TextNet\nfrom util.augmentation import Augmentation\nfrom cfglib.config import config as cfg, update_config, print_config\nfrom util.misc import AverageMeter\nfrom util.misc import mkdirs, to_device\nfrom cfglib.option import BaseOptions\nfrom util.visualize import visualize_network_output\nfrom util.summary import LogSummary\nfrom util.shedule import FixLR\n# import multiprocessing\n# multiprocessing.set_start_method(\"spawn\", force=True)\n\nlr = None\ntrain_step = 0\n\n\ndef save_model(model, epoch, lr, optimizer):\n\n    save_dir = os.path.join(cfg.save_dir, cfg.exp_name)\n    if not os.path.exists(save_dir):\n        mkdirs(save_dir)\n\n    save_path = os.path.join(save_dir, 'TextBPN_{}_{}.pth'.format(model.backbone_name, epoch))\n    print('Saving to {}.'.format(save_path))\n    state_dict = {\n        'lr': lr,\n        'epoch': epoch,\n        'model': model.state_dict() if not cfg.mgpu else model.module.state_dict()\n        # 'optimizer': optimizer.state_dict()\n    }\n    torch.save(state_dict, save_path)\n\n\ndef load_model(model, model_path):\n    print('Loading from {}'.format(model_path))\n    state_dict = torch.load(model_path)\n    model.load_state_dict(state_dict['model'])\n\n\ndef _parse_data(inputs):\n    input_dict = {}\n    inputs = list(map(lambda x: to_device(x), inputs))\n    input_dict['img'] = inputs[0]\n    input_dict['train_mask'] = inputs[1]\n    input_dict['tr_mask'] = inputs[2]\n    input_dict['distance_field'] = inputs[3]\n    
input_dict['direction_field'] = inputs[4]\n    input_dict['weight_matrix'] = inputs[5]\n    input_dict['gt_points'] = inputs[6]\n    input_dict['proposal_points'] = inputs[7]\n    input_dict['ignore_tags'] = inputs[8]\n\n    return input_dict\n\n\ndef train(model, train_loader, criterion, scheduler, optimizer, epoch):\n\n    global train_step\n\n    losses = AverageMeter()\n    batch_time = AverageMeter()\n    data_time = AverageMeter()\n    end = time.time()\n    model.train()\n    # scheduler.step()\n\n    print('Epoch: {} : LR = {}'.format(epoch, scheduler.get_lr()))\n\n    for i, inputs in enumerate(train_loader):\n\n        data_time.update(time.time() - end)\n        train_step += 1\n        input_dict = _parse_data(inputs)\n        output_dict = model(input_dict)\n        loss_dict = criterion(input_dict, output_dict, eps=epoch+1)\n        loss = loss_dict[\"total_loss\"]\n        # backward\n        try:\n            optimizer.zero_grad()\n            loss.backward()\n        except Exception as e:\n            print(\"backward pass failed, skipping batch: {}\".format(e))\n            continue\n        if cfg.grad_clip > 0:\n            torch.nn.utils.clip_grad_norm_(model.parameters(), cfg.grad_clip)\n\n        optimizer.step()\n\n        losses.update(loss.item())\n        # measure elapsed time\n        batch_time.update(time.time() - end)\n        end = time.time()\n\n        if cfg.viz and (i % cfg.viz_freq == 0 and i > 0) and epoch % 8 == 0:\n            visualize_network_output(output_dict, input_dict, mode='train')\n\n        if i % cfg.display_freq == 0:\n            gc.collect()\n            print_inform = \"({:d} / {:d}) \".format(i, len(train_loader))\n            for (k, v) in loss_dict.items():\n                print_inform += \" {}: {:.4f} \".format(k, v.item())\n            print(print_inform)\n\n    if cfg.exp_name == 'Synthtext' or cfg.exp_name == 'ALL':\n        if epoch % cfg.save_freq == 0:\n            save_model(model, epoch, scheduler.get_lr(), optimizer)\n    elif cfg.exp_name == 'MLT2019' 
or cfg.exp_name == 'ArT' or cfg.exp_name == 'MLT2017':\n        if epoch < 50 and cfg.max_epoch >= 200:\n            if epoch % (2*cfg.save_freq) == 0:\n                save_model(model, epoch, scheduler.get_lr(), optimizer)\n        else:\n            if epoch % cfg.save_freq == 0:\n                save_model(model, epoch, scheduler.get_lr(), optimizer)\n    else:\n        if epoch % cfg.save_freq == 0 and epoch > 50:\n            save_model(model, epoch, scheduler.get_lr(), optimizer)\n\n    print('Training Loss: {}'.format(losses.avg))\n\n\ndef main():\n\n    global lr\n    if cfg.exp_name == 'Totaltext':\n        trainset = TotalText(\n            data_root='data/total-text-mat',\n            ignore_list=None,\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n\n    elif cfg.exp_name == 'Synthtext':\n        trainset = SynthText(\n            data_root='data/SynthText',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n\n    elif cfg.exp_name == 'Ctw1500':\n        trainset = Ctw1500Text(\n            data_root='data/ctw1500',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n\n    elif cfg.exp_name == 'Icdar2015':\n        trainset = Icdar15Text(\n            data_root='data/Icdar2015',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n    elif cfg.exp_name == 'MLT2017':\n        trainset = Mlt2017Text(\n            data_root='data/MLT2017',\n            is_training=True,\n            
load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n\n    elif cfg.exp_name == 'TD500':\n        trainset = TD500Text(\n            data_root='data/TD500',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n\n    elif cfg.exp_name == 'ArT':\n        trainset = ArtTextJson(\n            data_root='data/ArT',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n\n    elif cfg.exp_name == 'MLT2019':\n        trainset = Mlt2019Text(\n            data_root='data/MLT-2019',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n        )\n        valset = None\n\n    elif cfg.exp_name == 'ALL':\n        trainset_art = ArtTextJson(\n            data_root='data/ArT',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds))\n\n        trainset_mlt19 = Mlt2019Text(\n            data_root='data/MLT-2019',\n            is_training=True,\n            load_memory=cfg.load_memory,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds))\n\n        trainset_lsvt = LsvtTextJson(\n            data_root=\"/home/prir1005/pubdata/LSVT\",\n            is_training=True,\n            transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds))\n        trainset = ConcatDataset([trainset_lsvt, trainset_mlt19, trainset_art])\n        valset = None\n\n    else:\n        print(\"dataset name is not correct\")\n\n    train_loader = data.DataLoader(trainset, 
batch_size=cfg.batch_size,\n                                   shuffle=True, num_workers=cfg.num_workers,\n                                   pin_memory=True)  # generator=torch.Generator(device=cfg.device)\n\n    # Model\n    model = TextNet(backbone=cfg.net, is_training=True)\n    model = model.to(cfg.device)\n    criterion = TextLoss()\n    if cfg.mgpu:\n        model = nn.DataParallel(model)\n    if cfg.cuda:\n        cudnn.benchmark = True\n    if cfg.resume:\n        load_model(model, cfg.resume)\n\n    lr = cfg.lr\n    moment = cfg.momentum\n    if cfg.optim == \"Adam\" or cfg.exp_name == 'Synthtext':\n        optimizer = torch.optim.Adam(model.parameters(), lr=lr)\n    else:\n        optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=moment)\n\n    if cfg.exp_name == 'Synthtext':\n        scheduler = FixLR(optimizer)\n    else:\n        scheduler = lr_scheduler.StepLR(optimizer, step_size=50, gamma=0.9)\n\n    print('Start training TextBPN++.')\n    for epoch in range(cfg.start_epoch, cfg.max_epoch+1):\n        scheduler.step()\n        # if epoch <= 300:\n        #     continue\n        train(model, train_loader, criterion, scheduler, optimizer, epoch)\n\n    print('End.')\n\n    if torch.cuda.is_available():\n        torch.cuda.empty_cache()\n\n\nif __name__ == \"__main__\":\n    np.random.seed(2022)\n    torch.manual_seed(2022)\n    # parse arguments\n    option = BaseOptions()\n    args = option.initialize()\n\n    update_config(cfg, args)\n    print_config(cfg)\n\n    # main\n    main()\n\n"
  },
  {
    "path": "train_textBPN_DDP.py",
    "content": "import os\nimport gc\nimport time\nimport torch\nimport numpy as np\nimport torch.nn as nn\nfrom torch.optim import lr_scheduler\nfrom torch.utils.data import ConcatDataset\nimport torch.distributed as dist\nfrom torch.utils.data import DataLoader, Dataset\nfrom torch.nn.parallel import DistributedDataParallel as DDP\nfrom torch.utils.data.distributed import DistributedSampler\n\nfrom dataset import AllTextJson\nfrom network.loss import TextLoss\nfrom network.textnet import TextNet\nfrom util.augmentation import Augmentation\nfrom cfglib.config import config as cfg, update_config, print_config\nfrom util.misc import AverageMeter\nfrom util.misc import mkdirs, to_device\nfrom cfglib.option import BaseOptions\nfrom util.visualize import visualize_network_output\nfrom util.summary import LogSummary\nfrom util.shedule import FixLR\nfrom torch.amp import GradScaler, autocast\n\n\nlr = None\ntrain_step = 0\n\nclass WarmupScheduler(lr_scheduler._LRScheduler):\n    def __init__(self, optimizer, warmup_epochs, base_lr, final_lr, after_scheduler=None):\n        self.warmup_epochs = warmup_epochs\n        self.base_lr = base_lr\n        self.final_lr = final_lr\n        self.after_scheduler = after_scheduler\n        super(WarmupScheduler, self).__init__(optimizer)\n\n    def get_last_lr(self):\n        if self.last_epoch < self.warmup_epochs:\n            warmup_lr = self.base_lr + (self.final_lr - self.base_lr) * (self.last_epoch / self.warmup_epochs)\n            return [warmup_lr]\n        if self.after_scheduler:\n            if self.last_epoch == self.warmup_epochs:\n                self.after_scheduler.base_lrs = self.final_lr\n            return self.after_scheduler.get_last_lr()\n        return [self.final_lr]\n\n    def step(self, epoch=None):\n        if self.last_epoch < self.warmup_epochs:\n            warmup_lr = self.base_lr + (self.final_lr - self.base_lr) * (self.last_epoch / self.warmup_epochs)\n            for param_group in 
self.optimizer.param_groups:\n                param_group['lr'] = warmup_lr\n        else:\n            if self.after_scheduler:\n                if self.last_epoch == self.warmup_epochs:\n                    # base_lrs is a per-param-group list; assigning a bare float would break the inner scheduler\n                    self.after_scheduler.base_lrs = [self.final_lr] * len(self.optimizer.param_groups)\n                self.after_scheduler.step()\n        self.last_epoch += 1\n\n\n\ndef save_model(model, epoch, lr, optimizer):\n    save_dir = os.path.join(cfg.save_dir, cfg.exp_name)\n    if not os.path.exists(save_dir):\n        mkdirs(save_dir)\n\n    save_path = os.path.join(save_dir, 'TextBPN_{}_{}.pth'.format(model.module.backbone_name, epoch))\n    print('Saving to {}.'.format(save_path))\n    state_dict = {\n        'lr': lr,\n        'epoch': epoch,\n        'model': model.state_dict() if not cfg.mgpu else model.module.state_dict(),\n        # 'optimizer': optimizer.state_dict()\n    }\n    torch.save(state_dict, save_path)\n\ndef load_model(model, model_path):\n    print('Loading from {}'.format(model_path))\n    state_dict = torch.load(model_path, map_location=cfg.device, weights_only=True)\n    model.load_state_dict(state_dict['model'])\n\ndef _parse_data(inputs):\n    input_dict = {}\n    inputs = list(map(lambda x: to_device(x), inputs))\n    input_dict['img'] = inputs[0]\n    input_dict['train_mask'] = inputs[1]\n    input_dict['tr_mask'] = inputs[2]\n    input_dict['distance_field'] = inputs[3]\n    input_dict['direction_field'] = inputs[4]\n    input_dict['weight_matrix'] = inputs[5]\n    input_dict['gt_points'] = inputs[6]\n    input_dict['proposal_points'] = inputs[7]\n    input_dict['ignore_tags'] = inputs[8]\n\n    return input_dict\n\ndef train(model, train_loader, criterion, scheduler, optimizer, epoch, scaler, use_amp=True, accum_grad_iters=1):\n    global train_step\n\n    losses = AverageMeter()\n    batch_time = AverageMeter()\n    data_time = AverageMeter()\n    end = time.time()\n    model.train()\n\n    print('Epoch: {} : LR = 
{}'.format(epoch, scheduler.get_last_lr()))\n\n    for i, inputs in enumerate(train_loader):\n        data_time.update(time.time() - end)\n        train_step += 1\n        input_dict = _parse_data(inputs)\n        with torch.amp.autocast('cuda', enabled=use_amp, dtype=torch.bfloat16):\n            output_dict = model(input_dict)\n            loss_dict = criterion(input_dict, output_dict, eps=epoch+1)\n            loss = loss_dict[\"total_loss\"]\n            loss /= accum_grad_iters  # NOTE: does not affect the loss_dict values used for logging\n\n            # skip NaN or exploding losses (the original `and` could never be true, since a NaN compares False)\n            if torch.isnan(loss).any() or loss.item() > 16.0:\n                print(f\"loss: {loss}\")\n                continue\n\n            try:\n                if use_amp:\n                    scaler.scale(loss).backward()\n                else:\n                    loss.backward()\n\n                # Clip the gradient\n                torch.nn.utils.clip_grad_norm_(model.parameters(), cfg.grad_clip)\n                # update gradients every accum_grad_iters iterations\n                if (i + 1) % accum_grad_iters == 0:\n                    if use_amp:\n                        scaler.step(optimizer)\n                        scaler.update()\n                    else:\n                        optimizer.step()\n                    optimizer.zero_grad()\n            except Exception:\n                # initialize the message first; it was previously used before assignment here\n                print_inform = \"loss gg\"\n                for (k, v) in loss_dict.items():\n                    print_inform += \" {}: {:.4f} \".format(k, v.item())\n                print(print_inform)\n                continue\n        \n        losses.update(loss.item())\n        # measure elapsed time\n        batch_time.update(time.time() - end)\n        end = time.time()\n\n        scheduler.step()\n        \n        # only visualize on the main process\n        if cfg.viz and (i % cfg.viz_freq == 0 and i > 0) and dist.get_rank() == 0:\n            
visualize_network_output(output_dict, input_dict, mode='train')\n\n        if i % cfg.display_freq == 0 and dist.get_rank() == 0:\n            gc.collect()\n            print_inform = \"({:d} / {:d}) lr: {:.6f}\".format(i, len(train_loader), scheduler.get_last_lr()[0])\n            for (k, v) in loss_dict.items():\n                print_inform += \" {}: {:.4f} \".format(k, v.item())\n            print(print_inform)\n        \n        if (i+1) % 2000 == 0 and dist.get_rank() == 0:\n            save_model(model, i, scheduler.get_last_lr(), optimizer)\n\n    # save a checkpoint every epoch (switch to cfg.save_freq here to save less often)\n    if epoch % 1 == 0 and dist.get_rank() == 0:\n        save_model(model, epoch, scheduler.get_last_lr(), optimizer)\n\n    print('Training Loss: {}'.format(losses.avg))\n\ndef main():\n    global lr\n    # local rank of this process\n    local_rank = cfg.local_rank\n    print(local_rank)\n\n    input_size_bucket = [512, 640, 800, 960]\n    batch_size_bucket = [48, 48, 32, 24]\n    data_root = \"/xxx/TextBPN-Plus-Plus-main/data\"\n    valset = None\n    trainset = AllTextJson(\n        data_root=data_root,\n        is_training=True,\n        transform=Augmentation(size=cfg.input_size, mean=cfg.means, std=cfg.stds)\n    )\n    print(f\"Load data: {len(trainset)}\")\n\n    # per-rank seeds for reproducibility\n    seed = 42 + local_rank\n    np.random.seed(seed)\n    torch.manual_seed(seed)\n    torch.cuda.manual_seed(seed)\n\n    # distributed sampler: its seed must be identical on every rank so that\n    # all replicas agree on the same shuffled order each epoch\n    train_sampler = DistributedSampler(trainset, num_replicas=dist.get_world_size(), \n                                            rank=dist.get_rank(), shuffle=True, seed=42)\n\n    # data loader\n    train_loader = DataLoader(trainset, 
batch_size=cfg.batch_size, num_workers=cfg.num_workers, \n                               sampler=train_sampler, pin_memory=True)\n\n    # Model\n    model = TextNet(backbone=cfg.net, is_training=True)\n    model = model.to(cfg.device)\n    criterion = TextLoss()\n\n    model = DDP(model, device_ids=[cfg.local_rank], output_device=cfg.local_rank, find_unused_parameters=True)\n\n    if cfg.resume:\n        load_model(model, cfg.resume)\n\n    lr = cfg.lr\n    moment = cfg.momentum\n    if cfg.optim == \"Adam\":\n        optimizer = torch.optim.Adam(model.parameters(), lr=lr)\n    else:\n        optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=moment)\n\n    scheduler = lr_scheduler.StepLR(optimizer, step_size=5000, gamma=0.9)\n\n    warmup_epochs = 5000  # number of warmup steps (the scheduler steps per iteration); adjust as needed\n    base_lr = 0.0  # learning rate at the start of warmup\n    final_lr = lr  # learning rate reached at the end of warmup\n    scheduler = WarmupScheduler(optimizer, warmup_epochs, base_lr, final_lr, scheduler)\n\n    # make sure all processes are synchronized before training starts\n    dist.barrier()\n    print('Start training TextBPN++.')\n    scaler = GradScaler('cuda')\n    for epoch in range(cfg.start_epoch, cfg.max_epoch + 1):\n        train_sampler.set_epoch(epoch)\n        train(model, train_loader, criterion, scheduler, optimizer, epoch, scaler, use_amp=cfg.use_amp, accum_grad_iters=cfg.accum_grad_iters)\n\n    print('End.')\n\n    if torch.cuda.is_available():\n        torch.cuda.empty_cache()\n\nif __name__ == \"__main__\":\n    # parse arguments\n    option = BaseOptions()\n    args = option.initialize()\n\n    update_config(cfg, args)\n    print_config(cfg)\n\n    # Initialize the process group\n    dist.init_process_group(backend='nccl', init_method='env://')\n    cfg.local_rank = int(os.environ['LOCAL_RANK'])\n    torch.cuda.set_device(cfg.local_rank)\n    print(cfg.local_rank)\n    \n\n  
  # main\n    main()\n\n    # Clean up the process group\n    dist.destroy_process_group()"
  },
  {
    "path": "util/__init__.py",
    "content": "from .visualize import *\nfrom .pbox import *\n"
  },
  {
    "path": "util/augmentation.py",
    "content": "# -*- coding: utf-8 -*-\n__author__ = \"S.X.Zhang\"\nimport numpy as np\nimport math\nimport cv2\nimport copy\nimport numpy.random as random\nfrom shapely.geometry import Polygon\nimport torchvision.transforms as transforms\nimport torchvision.transforms.functional as F\nfrom PIL import ImageEnhance, Image\n\n\n###<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<###\n###<<<<<<<<<  Function  >>>>>>>>>>>>###\n###>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>###\ndef crop_first(image, polygons, scale =10):\n    polygons_new = copy.deepcopy(polygons)\n    h, w, _ = image.shape\n    pad_h = h // scale\n    pad_w = w // scale\n    h_array = np.zeros((h + pad_h * 2), dtype=np.int32)\n    w_array = np.zeros((w + pad_w * 2), dtype=np.int32)\n\n    text_polys = []\n    pos_polys = []\n    for polygon in polygons_new:\n        rect = cv2.minAreaRect(polygon.points.astype(np.int32))\n        box = cv2.boxPoints(rect)\n        box = np.int0(box)\n        text_polys.append([box[0], box[1], box[2], box[3]])\n        if polygon.label != -1:\n            pos_polys.append([box[0], box[1], box[2], box[3]])\n\n    polys = np.array(text_polys, dtype=np.int32)\n    for poly in polys:\n        poly = np.round(poly, decimals=0).astype(np.int32)  # 四舍五入\n        minx = np.min(poly[:, 0])\n        maxx = np.max(poly[:, 0])\n        w_array[minx + pad_w:maxx + pad_w] = 1\n        miny = np.min(poly[:, 1])\n        maxy = np.max(poly[:, 1])\n        h_array[miny + pad_h:maxy + pad_h] = 1\n    # ensure the cropped area not across a text 保证截取区域不会横穿文字\n    h_axis = np.where(h_array == 0)[0]\n    w_axis = np.where(w_array == 0)[0]\n    pp_polys = np.array(pos_polys, dtype=np.int32)\n\n    return h_axis, w_axis, pp_polys\n\n####<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<####\n####<<<<<<<<<<<  Class  >>>>>>>>>>>>>####\n####>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>####\nclass Compose(object):\n    \"\"\"Composes several augmentations together.\n    Args:\n        transforms (List[Transform]): list of transforms to compose.\n    
Example:\n        >>> augmentations.Compose([\n        >>>     transforms.CenterCrop(10),\n        >>>     transforms.ToTensor(),\n        >>> ])\n    \"\"\"\n\n    def __init__(self, transforms):\n        self.transforms = transforms\n\n    def __call__(self, img, pts=None):\n        for t in self.transforms:\n            img, pts = t(img, pts)\n        return img, pts\n\n\nclass Normalize(object):\n    def __init__(self, mean, std):\n        self.mean = np.array(mean)\n        self.std = np.array(std)\n\n    def __call__(self, image, polygons=None):\n        image = image.astype(np.float32)\n        image /= 255.0\n        image -= self.mean\n        image /= self.std\n        return image, polygons\n\n\nclass MinusMean(object):\n    def __init__(self, mean):\n        self.mean = np.array(mean)\n\n    def __call__(self, image, polygons=None):\n        image = image.astype(np.float32)\n        image -= self.mean\n        return image, polygons\n\n\nclass RandomMirror(object):\n    # horizontal mirror\n    def __init__(self):\n        pass\n\n    def __call__(self, image, polygons=None):\n        if polygons is None:\n            return image, polygons\n        if random.random() < 0.3:\n            image = np.ascontiguousarray(image[:, ::-1])\n            _, width, _ = image.shape\n            for polygon in polygons:\n                polygon.points[:, 0] = width - polygon.points[:, 0]\n        return image, polygons\n\n\nclass AugmentColor(object):\n    # color augmentation (adds lighting noise)\n    def __init__(self):\n        self.U = np.array([[-0.56543481, 0.71983482, 0.40240142],\n                      [-0.5989477, -0.02304967, -0.80036049],\n                      [-0.56694071, -0.6935729, 0.44423429]], dtype=np.float32)\n        self.EV = np.array([1.65513492, 0.48450358, 0.1565086], dtype=np.float32)\n        self.sigma = 0.1\n        self.color_vec = None\n\n    def __call__(self, img, polygons=None):\n        color_vec = self.color_vec\n        if self.color_vec is None:\n            if not 
self.sigma > 0.0:\n                color_vec = np.zeros(3, dtype=np.float32)\n            else:\n                color_vec = np.random.normal(0.0, self.sigma, 3)\n\n        alpha = color_vec.astype(np.float32) * self.EV\n        noise = np.dot(self.U, alpha.T) * 255\n        return np.clip(img + noise[np.newaxis, np.newaxis, :], 0, 255), polygons\n\n\nclass RandomContrast(object):\n    def __init__(self, lower=0.5, upper=1.5):\n        self.lower = lower\n        self.upper = upper\n        assert self.upper >= self.lower, \"contrast upper must be >= lower.\"\n        assert self.lower >= 0, \"contrast lower must be non-negative.\"\n\n    # expects float image\n    def __call__(self, image, polygons=None):\n        if random.randint(2):\n            alpha = random.uniform(self.lower, self.upper)\n            image *= alpha\n        return np.clip(image, 0, 255), polygons\n\n\nclass RandomBrightness(object):\n    def __init__(self, delta=32):\n        assert delta >= 0.0\n        assert delta <= 255.0\n        self.delta = delta\n\n    def __call__(self, image, polygons=None):\n        image = image.astype(np.float32)\n        if random.randint(2):\n            delta = random.uniform(-self.delta, self.delta)\n            image += delta\n        return np.clip(image, 0, 255), polygons\n\n\nclass RandomErasing(object):\n    def __init__(self, sr=(0.0004, 0.01), scale=(0.5, 3), ratio=0.2, Type =\"Erasing\"):\n        \"\"\"\n\n        :param area:\n        :param type: Erasing or Cutout\n        \"\"\"\n        self.sr = sr\n        self.scale= scale\n        self.ratio=ratio\n        self.type=Type\n\n    def __call__(self, img, polygons=None):\n\n        if random.random()< self.ratio:\n            return img, polygons\n        area=img.shape[0]*img.shape[1]\n        target_area=random.randint(*self.sr)*area\n        aspect_ratio=random.uniform(*self.scale)\n        h = int(round(math.sqrt(target_area / aspect_ratio)))\n        w = int(round(math.sqrt(target_area * 
aspect_ratio)))\n\n        if w < img.shape[1] and h < img.shape[0]:\n            x1 = random.randint(0, img.shape[1] - w)\n            y1 = random.randint(0, img.shape[0] - h)\n            if self.type == \"Erasing\":\n                color = (random.randint(0, 255), random.randint(0, 255), random.randint(0, 255))\n                img[y1:y1 + h, x1:x1 + w, :] = color  # fixed: the width slice previously used h\n            else:\n                Gray_value = random.randint(0, 255)\n                color = (Gray_value, Gray_value, Gray_value)\n                img[y1:y1 + h, x1:x1 + w, :] = color  # fixed: the width slice previously used h\n\n        return img, polygons\n\n\nclass RandomMixUp(object):\n    def __init__(self, mixup_alpha=2):\n        self.mixup_alpha = mixup_alpha\n\n    def __call__(self, img1, img2, label1=[], label2=[]):\n        beta = np.random.beta(self.mixup_alpha, self.mixup_alpha)\n\n        # image = img1 * beta + (1 - beta) * img2\n        image = cv2.addWeighted(img1, beta, img2, (1 - beta), 0)\n\n        if label1 is None or label2 is None:\n            return img1, label1\n        if isinstance(label1, list) and isinstance(label2, list):\n            label = []\n            for id in range(len(label1)):\n                lab = beta * label1[id] + (1 - beta) * label2[id]\n                label.append(lab)\n            return image, label\n        else:\n            print(\"Error: label is not a list type\")\n\n        return img1, label1\n\n\nclass Rotate(object):\n    def __init__(self, up=30):\n        self.up = up\n\n    @staticmethod\n    def rotate(center, pt, theta):  # 2D rotation about a center point\n        xr, yr = center\n        yr = -yr\n        x, y = pt[:, 0], pt[:, 1]\n        y = -y\n\n        theta = theta / 180 * math.pi\n        cos = math.cos(theta)\n        sin = math.sin(theta)\n\n        _x = xr + (x - xr) * cos - (y - yr) * sin\n        _y = yr + (x - xr) * sin + (y - yr) * cos\n\n        return _x, -_y\n\n    def __call__(self, img, polygons=None):\n        if np.random.randint(2):\n            return img, polygons\n        angle = 
np.random.normal(loc=0.0, scale=0.5) * self.up  # angle drawn from a scaled Gaussian\n        rows, cols = img.shape[0:2]\n        M = cv2.getRotationMatrix2D((cols / 2, rows / 2), angle, 1.0)\n        img = cv2.warpAffine(img, M, (cols, rows), borderValue=[0, 0, 0])\n        center = cols / 2.0, rows / 2.0\n        if polygons is not None:\n            for polygon in polygons:\n                x, y = self.rotate(center, polygon.points, angle)\n                pts = np.vstack([x, y]).T\n                polygon.points = pts\n        return img, polygons\n\n\nclass RotatePadding(object):\n    def __init__(self, up=60, colors=True):\n        self.up = up\n        self.colors = colors\n        self.ratio = 0.5\n\n    @staticmethod\n    def rotate(center, pt, theta, movSize=[0, 0], scale=1):  # 2D rotation about a center point\n        (xr, yr) = center\n        yr = -yr\n        x, y = pt[:, 0], pt[:, 1]\n        y = -y\n\n        theta = theta / 180 * math.pi\n        cos = math.cos(theta)\n        sin = math.sin(theta)\n\n        x = (x - xr) * scale\n        y = (y - yr) * scale\n\n        _x = xr + x * cos - y * sin + movSize[0]\n        _y = -(yr + x * sin + y * cos) + movSize[1]\n\n        return _x, _y\n\n    @staticmethod\n    def shift(size, degree):\n        angle = degree * math.pi / 180.0\n        width = size[0]\n        height = size[1]\n\n        alpha = math.cos(angle)\n        beta = math.sin(angle)\n        new_width = int(width * math.fabs(alpha) + height * math.fabs(beta))\n        new_height = int(width * math.fabs(beta) + height * math.fabs(alpha))\n\n        size = [new_width, new_height]\n        return size\n\n    def __call__(self, image, polygons=None, scale=1.0):\n        if np.random.random() <= self.ratio:\n            return image, polygons\n        angle = np.random.normal(loc=0.0, scale=0.5) * self.up  # angle drawn from a scaled Gaussian\n        rows, cols = image.shape[0:2]\n        center = (cols / 2.0, rows / 2.0)\n        newSize = self.shift([cols * scale, rows * scale], angle)\n        movSize = 
[int((newSize[0] - cols) / 2), int((newSize[1] - rows) / 2)]\n\n        M = cv2.getRotationMatrix2D(center, angle, scale)\n        M[0, 2] += int((newSize[0] - cols) / 2)\n        M[1, 2] += int((newSize[1] - rows) / 2)\n\n        if self.colors:\n            H, W, _ = image.shape\n            mask = np.zeros_like(image)\n            (h_index, w_index) = (np.random.randint(0, H * 7 // 8), np.random.randint(0, W * 7 // 8))\n            img_cut = image[h_index:(h_index + H // 9), w_index:(w_index + W // 9)]\n            img_cut = cv2.resize(img_cut, (newSize[0], newSize[1]))\n            mask = cv2.warpAffine(mask, M, (newSize[0], newSize[1]), borderValue=[1, 1, 1])\n            image = cv2.warpAffine(image, M, (newSize[0], newSize[1]), borderValue=[0,0,0])\n            image=image+img_cut*mask\n        else:\n            color = [0, 0, 0]\n            image = cv2.warpAffine(image, M, (newSize[0], newSize[1]), borderValue=color)\n\n        if polygons is not None:\n            for polygon in polygons:\n                x, y = self.rotate(center, polygon.points, angle,movSize,scale)\n                pts = np.vstack([x, y]).T\n                polygon.points = pts\n        return image, polygons\n\n\nclass SquarePadding(object):\n\n    def __call__(self, image, polygons=None):\n\n        H, W, _ = image.shape\n\n        if H == W:\n            return image, polygons\n\n        padding_size = max(H, W)\n        (h_index, w_index) = (np.random.randint(0, H*7//8),np.random.randint(0, W*7//8))\n        img_cut = image[h_index:(h_index+H//9),w_index:(w_index+W//9)]\n        expand_image = cv2.resize(img_cut,(padding_size, padding_size))\n        #expand_image = np.zeros((padding_size, padding_size, 3), dtype=image.dtype)\n        #expand_image=img_cut[:,:,:]\n        if H > W:\n            y0, x0 = 0, (H - W) // 2\n        else:\n            y0, x0 = (W - H) // 2, 0\n        if polygons is not None:\n            for polygon in polygons:\n                polygon.points += 
np.array([x0, y0])\n        expand_image[y0:y0+H, x0:x0+W] = image\n        image = expand_image\n\n        return image, polygons\n\n\nclass RandomImgCropPatch(object):\n    def __init__(self, up=30, beta=0.3):\n        self.up = up\n        self.beta = beta  # fixed: was hard-coded to 0.3, ignoring the argument\n        self.scale = 10\n\n    @staticmethod\n    def get_contour_min_area_box(contour):\n        rect = cv2.minAreaRect(contour)\n        box = cv2.boxPoints(rect)\n        box = np.intp(box)  # np.int0 was removed in NumPy 2.0; np.intp is the same type\n        return box\n\n    def CropWH(self, image, cut_w, cut_h, polygons=None):\n        h_axis, w_axis, polys = crop_first(image, polygons, scale=self.scale)\n        h, w, _ = image.shape\n        pad_h = h // self.scale\n        pad_w = w // self.scale\n        # TODO try Flip\n        xx = np.random.choice(w_axis, size=2)\n        xmin = np.min(xx) - pad_w\n        xmax = xmin + cut_w\n        yy = np.random.choice(h_axis, size=2)\n        ymin = np.min(yy) - pad_h\n        ymax = ymin + cut_h\n        if polys.shape[0] != 0:\n            poly_axis_in_area = (polys[:, :, 0] >= xmin) & (polys[:, :, 0] <= xmax) \\\n                                & (polys[:, :, 1] >= ymin) & (polys[:, :, 1] <= ymax)\n            selected_polys = np.where(np.sum(poly_axis_in_area, axis=1) == 4)[0]\n        else:\n            selected_polys = []\n\n        cropped = image[ymin:ymax + 1, xmin:xmax + 1, :]\n        polygons_new = []\n        for idx in selected_polys:\n            polygon = polygons[idx]\n            polygon.points -= np.array([xmin, ymin])\n            polygons_new.append(polygon)\n\n        return cropped, polygons_new\n\n    def __call__(self, images, polygons_list=None):\n        I_x, I_y = 1024, 1024\n\n        w = int(round(I_x * random.beta(self.beta, self.beta)))\n        h = int(round(I_y * random.beta(self.beta, self.beta)))\n        w_ = [w, I_x - w, w, I_x - w]\n        h_ = [h, h, I_y - h, I_y - h]\n        new_img = np.zeros((I_x, I_y, 3), 
dtype=images[0].dtype)\n        imgs=[]\n        new_polygons=[]\n        for i, im in enumerate(images):\n           img, polygons = self.CropWH(im,  w_[i],  h_[i], polygons=polygons_list[i])\n           imgs.append(img)\n           new_polygons.append(polygons)\n        new_img[0:w, 0:h, :] = imgs[0]\n        new_img[w:I_x, 0:h, :] = imgs[1]\n        new_img[0:w, h:I_y, :] = imgs[2]\n        new_img[w:I_x, h:I_y, :] = imgs[3]\n        for polygon in new_polygons[1]:\n            polygon.points += np.array([w, 0])\n        for polygon in new_polygons[2]:\n            polygon.points += np.array([0, h])\n        for polygon in new_polygons[3]:\n            polygon.points += np.array([w, h])\n\n        polygons=new_polygons[0]+new_polygons[1]+new_polygons[2]+new_polygons[3]\n\n        return new_img, polygons\n\n\nclass RandomCropFlip(object):\n\n    def __init__(self, min_crop_side_ratio=0.01):\n        self.scale = 10\n        self.ratio = 0.2\n        self.epsilon = 10.0\n        self.min_crop_side_ratio = min_crop_side_ratio\n\n    def __call__(self, image, polygons=None):\n\n        if polygons is None:\n            return image, polygons\n\n        if np.random.random() <= self.ratio:\n            return image, polygons\n\n        # compute the valid crop region first, so that sampled seed points avoid text\n        h_axis, w_axis, pp_polys = crop_first(image, polygons, scale=self.scale)\n        if len(h_axis) == 0 or len(w_axis) == 0:\n            return image, polygons\n\n        # TODO try crop\n        attempt = 0\n        h, w, _ = image.shape\n        area = h * w\n        pad_h = h // self.scale\n        pad_w = w // self.scale\n        while attempt < 10:\n            attempt += 1\n            polygons_new = []\n            xx = np.random.choice(w_axis, size=2)\n            xmin = np.min(xx) - pad_w\n            xmax = np.max(xx) - pad_w\n            xmin = np.clip(xmin, 0, w - 1)\n            xmax = np.clip(xmax, 0, w - 1)\n            yy = np.random.choice(h_axis, size=2)\n            ymin = 
np.min(yy) - pad_h\n            ymax = np.max(yy) - pad_h\n            ymin = np.clip(ymin, 0, h - 1)\n            ymax = np.clip(ymax, 0, h - 1)\n            if (xmax - xmin) * (ymax - ymin) < area * self.min_crop_side_ratio:\n                # area too small\n                continue\n\n            pts = np.stack([[xmin, xmax, xmax, xmin], [ymin, ymin, ymax, ymax]]).T.astype(np.int32)\n            pp = Polygon(pts).buffer(0)\n            Fail_flag = False\n            for polygon in polygons:\n                ppi = Polygon(polygon.points).buffer(0)\n                ppiou = float(ppi.intersection(pp).area)\n                if np.abs(ppiou - float(ppi.area)) > self.epsilon and np.abs(ppiou) > self.epsilon:\n                    Fail_flag = True\n                    break\n                if np.abs(ppiou - float(ppi.area)) < self.epsilon:\n                    polygons_new.append(polygon)\n\n            if Fail_flag:\n                continue\n            else:\n                break\n\n        if len(polygons_new) == 0:\n            cropped = image[ymin:ymax, xmin:xmax, :]\n            select_type = random.randint(3)\n            if select_type == 0:\n                img = np.ascontiguousarray(cropped[:, ::-1])\n            elif select_type == 1:\n                img = np.ascontiguousarray(cropped[::-1, :])\n            else:\n                img = np.ascontiguousarray(cropped[::-1, ::-1])\n            image[ymin:ymax, xmin:xmax, :] = img\n            return image, polygons\n\n        else:\n            cropped = image[ymin:ymax, xmin:xmax, :]\n            height, width, _ = cropped.shape\n            select_type = random.randint(3)\n            if select_type == 0:\n                img = np.ascontiguousarray(cropped[:, ::-1])\n                for polygon in polygons_new:\n                    polygon.points[:, 0] = width - polygon.points[:, 0] + 2 * xmin\n            elif select_type == 1:\n                img = np.ascontiguousarray(cropped[::-1, :])\n                
for polygon in polygons_new:\n                    polygon.points[:, 1] = height - polygon.points[:, 1] + 2 * ymin\n            else:\n                img = np.ascontiguousarray(cropped[::-1, ::-1])\n                for polygon in polygons_new:\n                    polygon.points[:, 0] = width - polygon.points[:, 0] + 2 * xmin\n                    polygon.points[:, 1] = height - polygon.points[:, 1] + 2 * ymin\n            image[ymin:ymax, xmin:xmax, :] = img\n\n        return image, polygons\n\n\nclass RandomResizedCrop(object):\n    def __init__(self, min_crop_side_ratio=0.1):\n        self.scale = 10\n        self.epsilon = 1e-2\n        self.min_crop_side_ratio = min_crop_side_ratio\n\n    def __call__(self, image, polygons):\n\n        if polygons is None:\n            return image, polygons\n\n        # compute the valid crop region first, so that sampled seed points avoid text\n        h_axis, w_axis, pp_polys = crop_first(image, polygons, scale=self.scale)\n        if len(h_axis) == 0 or len(w_axis) == 0:\n            return image, polygons\n\n        # TODO try crop\n        attempt = 0\n        h, w, _ = image.shape\n        area = h * w\n        pad_h = h // self.scale\n        pad_w = w // self.scale\n        while attempt < 10:\n            attempt += 1\n            xx = np.random.choice(w_axis, size=2)\n            xmin = np.min(xx) - pad_w\n            xmax = np.max(xx) - pad_w\n            xmin = np.clip(xmin, 0, w - 1)\n            xmax = np.clip(xmax, 0, w - 1)\n            yy = np.random.choice(h_axis, size=2)\n            ymin = np.min(yy) - pad_h\n            ymax = np.max(yy) - pad_h\n            ymin = np.clip(ymin, 0, h - 1)\n            ymax = np.clip(ymax, 0, h - 1)\n            if (xmax - xmin) * (ymax - ymin) < area * self.min_crop_side_ratio:\n                # area too small\n                continue\n            if pp_polys.shape[0] != 0:\n                poly_axis_in_area = (pp_polys[:, :, 0] >= xmin) & (pp_polys[:, :, 0] <= xmax) \\\n                                    & (pp_polys[:, :, 
1] >= ymin) & (pp_polys[:, :, 1] <= ymax)\n                selected_polys = np.where(np.sum(poly_axis_in_area, axis=1) == 4)[0]\n            else:\n                selected_polys = []\n\n            if len(selected_polys) == 0:\n                continue\n            else:\n                pts = np.stack([[xmin, xmax, xmax, xmin], [ymin, ymin, ymax, ymax]]).T.astype(np.int32)\n                pp = Polygon(pts).buffer(0)\n                polygons_new = []\n                Fail_flag = False\n                for polygon in copy.deepcopy(polygons):\n                    ppi = Polygon(polygon.points).buffer(0)\n                    ppiou = float(ppi.intersection(pp).area)\n                    if np.abs(ppiou - float(ppi.area)) > self.epsilon and np.abs(ppiou) > self.epsilon:\n                        Fail_flag = True\n                        break\n                    elif np.abs(ppiou - float(ppi.area)) < self.epsilon:\n                        # polygon.points -= np.array([xmin, ymin])\n                        polygons_new.append(polygon)\n\n                if Fail_flag:\n                    continue\n                else:\n                    cropped = image[ymin:ymax + 1, xmin:xmax + 1, :]\n                    for polygon in polygons_new:\n                        polygon.points -= np.array([xmin, ymin])\n\n                    return cropped, polygons_new\n\n        return image, polygons\n\n\nclass RandomResizeScale(object):\n    def __init__(self, size=512, ratio=(3./4, 5./2)):\n        self.size = size\n        self.ratio = ratio\n\n    def __call__(self, image, polygons=None):\n\n        aspect_ratio = np.random.uniform(self.ratio[0], self.ratio[1])\n        h, w, _ = image.shape\n        scales = self.size*1.0/max(h, w)\n        aspect_ratio = scales * aspect_ratio\n        aspect_ratio = int(w * aspect_ratio)*1.0/w\n        image = cv2.resize(image, (int(w * aspect_ratio), int(h*aspect_ratio)))\n        scales = np.array([aspect_ratio, aspect_ratio])\n        if 
polygons is not None:\n            for polygon in polygons:\n                polygon.points = polygon.points * scales\n\n        return image, polygons\n\n\nclass Resize(object):\n    def __init__(self, size=1024):\n        self.size = size\n        self.SP = SquarePadding()\n\n    def __call__(self, image, polygons=None):\n        h, w, _ = image.shape\n        image = cv2.resize(image, (self.size,\n                                   self.size))\n        scales = np.array([self.size / w, self.size / h])\n\n        if polygons is not None:\n            for polygon in polygons:\n                polygon.points = polygon.points * scales\n\n        return image, polygons\n\n\nclass ResizeSquare(object):\n    def __init__(self, size=(480, 1280)):\n        self.size = size\n\n    def __call__(self, image, polygons=None):\n        h, w, _ = image.shape\n        img_size_min = min(h, w)\n        img_size_max = max(h, w)\n\n        if img_size_min < self.size[0]:\n            im_scale = float(self.size[0]) / float(img_size_min)  # expand min to size[0]\n            if np.ceil(im_scale * img_size_max) > self.size[1]:  # expand max can't > size[1]\n                im_scale = float(self.size[1]) / float(img_size_max)\n        elif img_size_max > self.size[1]:\n            im_scale = float(self.size[1]) / float(img_size_max)\n        else:\n            im_scale = 1.0\n\n        new_h = int(int(h * im_scale/32)*32)\n        new_w = int(int(w * im_scale/32)*32)\n        # if new_h*new_w > 1600*1920:\n        #     im_scale = 1600 / float(img_size_max)\n        #     new_h = int(int(h * im_scale/32)*32)\n        #     new_w = int(int(w * im_scale/32)*32)\n        image = cv2.resize(image, (new_w, new_h))\n        scales = np.array([new_w / w, new_h / h])\n        if polygons is not None:\n            for polygon in polygons:\n                polygon.points = polygon.points * scales\n\n        return image, polygons\n\n\nclass ResizeLimitSquare(object):\n    def __init__(self, 
size=512, ratio=0.6):\n        self.size = size\n        self.ratio = ratio\n        self.SP = SquarePadding()\n\n    def __call__(self, image, polygons=None):\n        if np.random.random() <= self.ratio:\n            image, polygons = self.SP(image, polygons)\n        h, w, _ = image.shape\n        image = cv2.resize(image, (self.size,self.size))\n        scales = np.array([self.size*1.0/ w, self.size*1.0 / h])\n\n        if polygons is not None:\n            for polygon in polygons:\n                polygon.points = polygon.points * scales\n\n        return image, polygons\n\n\nclass RandomResizePadding(object):\n    def __init__(self, size=512, random_scale=np.array([0.75, 1.0, 1.25,1.5,2.0]),stride=32, ratio=0.6667):\n        self.random_scale = random_scale\n        self.size = size\n        self.ratio=ratio\n        self.stride=stride\n        self.SP=SquarePadding()\n\n        ########### Random size for different epochs ########################\n        rd_scale = np.random.choice(self.random_scale)\n        step_num = round(np.random.normal(loc=0.0, scale=0.35) * 8)  # step size follows a Gaussian distribution\n        self.input_size = np.clip(int(self.size * rd_scale + step_num * self.stride),\n                                  (int(self.size * self.random_scale[0] - self.stride)),\n                                  int(self.size * self.random_scale[-1] + self.stride))\n        ############################ end ########################\n\n    def __call__(self, image, polygons=None):\n\n        if np.random.random() <= self.ratio:\n            image, polygons = self.SP(image, polygons)\n        h, w, _ = image.shape\n        image = cv2.resize(image, (self.input_size,self.input_size))\n        scales = np.array([self.input_size*1.0/ w, self.input_size*1.0 / h])\n\n        if polygons is not None:\n            for polygon in polygons:\n                polygon.points = polygon.points * scales\n\n        return image, polygons\n\ntransform_type_dict = dict(\n    
brightness=ImageEnhance.Brightness, contrast=ImageEnhance.Contrast,\n    sharpness=ImageEnhance.Sharpness,   color=ImageEnhance.Color\n)\n\n\nclass RandomDistortion(object):\n    def __init__(self, transform_dict, prob=0.5):\n        self.transforms = [(transform_type_dict[k], transform_dict[k]) for k in transform_dict]\n        self.prob = prob\n\n    def __call__(self, img, target):\n        if random.random() > self.prob:\n            return img, target\n        out = Image.fromarray(img)\n        rand_num = np.random.uniform(0, 1, len(self.transforms))\n\n        for i, (transformer, alpha) in enumerate(self.transforms):\n            r = alpha * (rand_num[i] * 2.0 - 1.0) + 1  # r in [1-alpha, 1+alpha)\n            out = transformer(out).enhance(r)\n\n        return np.array(out), target\n\n\nclass Augmentation(object):\n    def __init__(self, size, mean, std):\n        self.size = size\n        self.mean = mean\n        self.std = std\n        self._transform_dict = {'brightness': 0.5, 'contrast': 0.5, 'sharpness': 0.8386, 'color': 0.5}\n        self.augmentation = Compose([\n            RandomCropFlip(),\n            RandomResizeScale(size=self.size, ratio=(3. / 8, 5. 
/ 2)),\n            RandomResizedCrop(),\n            RotatePadding(up=60, colors=True),  # use up=30 when pretraining on SynthText, otherwise up=60\n            ResizeLimitSquare(size=self.size),\n            RandomMirror(),\n            RandomDistortion(self._transform_dict),\n            Normalize(mean=self.mean, std=self.std),\n        ])\n\n    def __call__(self, image, polygons=None):\n        return self.augmentation(image, polygons)\n\n\nclass BaseTransform(object):\n    def __init__(self, size, mean, std):\n        self.size = size\n        self.mean = mean\n        self.std = std\n        self.augmentation = Compose([\n            # Resize(size=640),\n            ResizeSquare(size=self.size),\n            Normalize(mean, std)\n        ])\n\n    def __call__(self, image, polygons=None):\n        return self.augmentation(image, polygons)\n\n\nclass BaseTransformNresize(object):\n    def __init__(self, mean, std):\n        self.mean = mean\n        self.std = std\n        self.augmentation = Compose([\n            Normalize(mean, std)\n        ])\n\n    def __call__(self, image, polygons=None):\n        return self.augmentation(image, polygons)\n"
  },
  {
    "path": "util/canvas.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n__author__ = '古溪'\n\nimport numpy as np\nimport random\nimport matplotlib.pyplot as plt\n\n\ndef heatmap(im_gray):\n    cmap = plt.get_cmap('jet')\n    rgba_img = cmap(255 - im_gray)\n    Hmap = np.delete(rgba_img, 3, 2)\n    # print(Hmap.shape, Hmap.max(), Hmap.min())\n    # cv2.imshow(\"heat_img\", Hmap)\n    # cv2.waitKey(0)\n    return Hmap\n\n\ndef loss_ploy(loss_list, steps, period, name=\"\"):\n    fig1, ax1 = plt.subplots(figsize=(16, 9))\n    ax1.plot(range(steps // period), loss_list)\n    ax1.set_title(\"Average loss vs step*{}\".format(period))\n    ax1.set_xlabel(\"step*{}\".format(period))\n    ax1.set_ylabel(\"Current loss\")\n    plt.savefig('{}@loss_vs_step*{}.png'.format(name,period))\n    plt.clf()\n\n\ndef plt_ploys(ploys, period, name=\"\"):\n    fig1, ax1 = plt.subplots(figsize=(16, 9))\n    cnames = ['aliceblue','antiquewhite','aqua','aquamarine','azure',\n               'blanchedalmond','blue','blueviolet','brown','burlywood',\n               'coral','cornflowerblue','cornsilk','crimson','cyan',\n               'darkblue','deeppink','deepskyblue','dodgerblue','forestgreen',\n               'gold','goldenrod','green','greenyellow','honeydew','hotpink',\n               'lawngreen','lightblue','lightgreen','lightpink','lightsalmon',\n               'lightseagreen','lightsteelblue','lightyellow','lime','limegreen',\n               'mediumseagreen','mediumspringgreen','midnightblue','orange','orangered',\n               'pink','red','royalblue','seagreen','skyblue','springgreen','steelblue',\n               'tan','teal','thistle','yellow','yellowgreen']\n\n    color = random.sample(cnames, len(ploys.keys()))\n    for ii, key in enumerate(ploys.keys()):\n        ax1.plot(range(1, len(ploys[key])+1), ploys[key],color=color[ii], label=key)\n    ax1.set_title(\"Loss Carve line\")\n    ax1.set_xlabel(\"step*{}\".format(period))\n    ax1.set_ylabel(\"Current loss\")\n    
plt.legend(ploys.keys())\n    plt.savefig('{}@loss_vs_step*{}.png'.format(name, period))\n    plt.clf()\n\nif __name__ == '__main__':\n    # TODO ADD CODE\n    pass"
  },
  {
    "path": "util/detection.py",
    "content": "# C++ version of PSE decoding, based on OpenCV 3+\nfrom pse import decode as pse_decode\nfrom cfglib.config import config as cfg\n\n\nclass TextDetector(object):\n\n    def __init__(self, model):\n        # evaluation mode\n        self.model = model\n        model.eval()\n        # parameters\n        self.scale = cfg.scale\n        self.threshold = cfg.threshold\n\n    def detect(self, image, img_show):\n        # get model output\n        preds = self.model.forward(image)\n        preds, boxes, contours = pse_decode(preds[0], self.scale, self.threshold)\n\n        output = {\n            'image': image,\n            'tr': preds,\n            'bbox': boxes\n        }\n        return contours, output\n"
  },
  {
    "path": "util/eval.py",
    "content": "import os\nimport cv2\nimport numpy as np\nimport subprocess\nfrom cfglib.config import config as cfg\nfrom util.misc import mkdirs\n\n\ndef osmkdir(out_dir):\n    import shutil\n    if os.path.exists(out_dir):\n        shutil.rmtree(out_dir)\n    os.makedirs(out_dir)\n\n\ndef analysize_result(source_dir, fid_path, outpt_dir, name):\n\n    bad_txt = open(\"{}/eval.txt\".format(outpt_dir), 'w')\n    all_eval = open(\"{}/{}/{}_eval.txt\".format(cfg.output_dir, \"Analysis\", name), 'a+')\n    sel_list = list()\n    with open(fid_path) as f:\n        lines = f.read().split(\"\\n\")\n        for line in lines:\n            line_items = line.split(\" \")\n            if len(line_items) < 5:  # skip blank or malformed lines\n                continue\n            id = line_items[0]\n            precision = float(line_items[2].split('=')[-1])\n            recall = float(line_items[4].split('=')[-1])\n            if id != \"ALL\" and (precision < 0.5 or recall < 0.5):\n                img_path = os.path.join(source_dir, line_items[0].replace(\".txt\", \".jpg\"))\n                if os.path.exists(img_path):\n                    os.system('cp {} {}'.format(img_path, outpt_dir))\n                sel_list.append((int(id.replace(\".txt\", \"\").replace(\"img\", \"\").replace(\"_\", \"\")), line))\n            if id == \"ALL\":\n                all_eval.write(\"{} {} {}\\n\".format(\n                    outpt_dir.split('/')[-1],\n                    \"{}/{}\".format(cfg.dis_threshold, cfg.cls_threshold),\n                    line))\n    sel_list = sorted(sel_list, key=lambda its: its[0])\n    bad_txt.write('\\n'.join([its[1] for its in sel_list]))\n    all_eval.close()\n    bad_txt.close()\n\n\ndef deal_eval_total_text(debug=False):\n    # compute DetEval\n    eval_dir = os.path.join(cfg.output_dir, \"Analysis\", \"output_eval\")\n    if not os.path.exists(eval_dir):\n        os.makedirs(eval_dir)\n\n    print('Computing DetEval in {}/{}'.format(cfg.output_dir, cfg.exp_name))\n    subprocess.call(\n        ['python', 
'dataset/total_text/Evaluation_Protocol/Python_scripts/Deteval.py', cfg.exp_name, '--tr', '0.7',\n         '--tp', '0.6'])\n    subprocess.call(\n        ['python', 'dataset/total_text/Evaluation_Protocol/Python_scripts/Deteval.py', cfg.exp_name, '--tr', '0.8',\n         '--tp', '0.4'])\n\n    if debug:\n        source_dir = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name))\n        outpt_dir_base = os.path.join(cfg.output_dir, \"Analysis\", \"eval_view\", \"total_text\")\n        if not os.path.exists(outpt_dir_base):\n            mkdirs(outpt_dir_base)\n\n        outpt_dir1 = os.path.join(outpt_dir_base, \"{}_{}_{}_{}_{}\"\n                                  .format(cfg.test_size[0], cfg.test_size[1], cfg.checkepoch, 0.7, 0.6))\n        osmkdir(outpt_dir1)\n        fid_path1 = '{}/Eval_TotalText_{}_{}.txt'.format(eval_dir, 0.7, 0.6)\n\n        analysize_result(source_dir, fid_path1, outpt_dir1, \"totalText\")\n\n        outpt_dir2 = os.path.join(outpt_dir_base, \"{}_{}_{}_{}_{}\"\n                                  .format(cfg.test_size[0], cfg.test_size[1], cfg.checkepoch, 0.8, 0.4))\n        osmkdir(outpt_dir2)\n        fid_path2 = '{}/Eval_TotalText_{}_{}.txt'.format(eval_dir, 0.8, 0.4)\n\n        analysize_result(source_dir, fid_path2, outpt_dir2, \"totalText\")\n\n    print('End.')\n\n\ndef deal_eval_ctw1500(debug=False):\n    # compute DetEval\n    eval_dir = os.path.join(cfg.output_dir, \"Analysis\", \"output_eval\")\n    if not os.path.exists(eval_dir):\n        os.makedirs(eval_dir)\n\n    print('Computing DetEval in {}/{}'.format(cfg.output_dir, cfg.exp_name))\n    subprocess.call(['python', 'dataset/ctw1500/Evaluation_Protocol/ctw1500_eval.py', cfg.exp_name])\n\n    if debug:\n        source_dir = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name))\n        outpt_dir_base = os.path.join(cfg.output_dir, \"Analysis\", \"eval_view\", \"ctw1500\")\n        if not os.path.exists(outpt_dir_base):\n            mkdirs(outpt_dir_base)\n\n        
outpt_dir = os.path.join(outpt_dir_base, \"{}_{}_{}\".format(cfg.test_size[0], cfg.test_size[1], cfg.checkepoch))\n        osmkdir(outpt_dir)\n        fid_path1 = '{}/Eval_ctw1500_{}.txt'.format(eval_dir, 0.5)\n\n        analysize_result(source_dir, fid_path1, outpt_dir, \"ctw1500\")\n\n    print('End.')\n\n\ndef deal_eval_icdar15(debug=False):\n    # compute DetEval\n    eval_dir = os.path.join(cfg.output_dir, \"Analysis\", \"output_eval\")\n    if not os.path.exists(eval_dir):\n        os.makedirs(eval_dir)\n\n    input_dir = 'output/{}'.format(cfg.exp_name)\n    father_path = os.path.abspath(input_dir)\n    print(father_path)\n    print('Computing DetEval in {}/{}'.format(cfg.output_dir, cfg.exp_name))\n    subprocess.call(['sh', 'dataset/icdar15/eval.sh', father_path])\n\n    if debug:\n        source_dir = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name))\n        outpt_dir_base = os.path.join(cfg.output_dir, \"Analysis\", \"eval_view\", \"icdar15\")\n        if not os.path.exists(outpt_dir_base):\n            mkdirs(outpt_dir_base)\n\n        outpt_dir = os.path.join(outpt_dir_base, \"{}_{}_{}\".format(cfg.test_size[0], cfg.test_size[1], cfg.checkepoch))\n        osmkdir(outpt_dir)\n        fid_path1 = '{}/Eval_icdar15.txt'.format(eval_dir)\n\n        analysize_result(source_dir, fid_path1, outpt_dir, \"icdar15\")\n\n    print('End.')\n\n    pass\n\n\ndef deal_eval_TD500(debug=False):\n    # compute DetEval\n    eval_dir = os.path.join(cfg.output_dir, \"Analysis\", \"output_eval\")\n    if not os.path.exists(eval_dir):\n        os.makedirs(eval_dir)\n\n    input_dir = 'output/{}'.format(cfg.exp_name)\n    father_path = os.path.abspath(input_dir)\n    print(father_path)\n    print('Computing DetEval in {}/{}'.format(cfg.output_dir, cfg.exp_name))\n    subprocess.call(['sh', 'dataset/TD500/eval.sh', father_path])\n\n    if debug:\n        source_dir = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name))\n        outpt_dir_base = 
os.path.join(cfg.output_dir, \"Analysis\", \"eval_view\", \"TD500\")\n        if not os.path.exists(outpt_dir_base):\n            mkdirs(outpt_dir_base)\n\n        outpt_dir = os.path.join(outpt_dir_base, \"{}_{}_{}\".format(cfg.test_size[0], cfg.test_size[1], cfg.checkepoch))\n        osmkdir(outpt_dir)\n        fid_path1 = '{}/Eval_TD500.txt'.format(eval_dir)\n\n        analysize_result(source_dir, fid_path1, outpt_dir, \"TD500\")\n\n    print('End.')\n\n\ndef data_transfer_ICDAR(contours):\n    cnts = list()\n    for cont in contours:\n        rect = cv2.minAreaRect(cont)\n        if min(rect[1][0], rect[1][1]) <= 5:\n            continue\n        points = cv2.boxPoints(rect)\n        points = np.int0(points)\n        # print(points.shape)\n        # points = np.reshape(points, (4, 2))\n        cnts.append(points)\n    return cnts\n\n\ndef data_transfer_TD500(contours, res_file, img=None):\n    with open(res_file, 'w') as f:\n        for cont in contours:\n            rect = cv2.minAreaRect(cont)\n            if min(rect[1][0], rect[1][1]) <= 5:\n                continue\n            points = cv2.boxPoints(rect)\n            box = np.int0(points)\n            cv2.drawContours(img, [box], 0, (0, 255, 0), 3)\n\n            cx, cy = rect[0]\n            w_, h_ = rect[1]\n            angle = rect[2]\n            if angle > 45:\n                angle = 90 - angle\n                w_, h_ = h_, w_\n            elif angle < -45:\n                angle = 90 + angle\n                w_, h_ = h_, w_\n            angle = angle / 180 * 3.141592653589\n\n            x_min = int(cx - w_ / 2)\n            x_max = int(cx + w_ / 2)\n            y_min = int(cy - h_ / 2)\n            y_max = int(cy + h_ / 2)\n            f.write('{},{},{},{},{}\\r\\n'.format(x_min, y_min, x_max, y_max, angle))\n\n    return img\n\n\ndef data_transfer_MLT2017(contours, res_file):\n   
 with open(res_file, 'w') as f:\n        for cont in contours:\n            rect = cv2.minAreaRect(cont)\n            if min(rect[1][0], rect[1][1]) <= 5:\n                continue\n            poly_area = cv2.contourArea(cont)\n            rect_area = rect[1][0]*rect[1][1]\n            solidity = poly_area/rect_area\n            width = rect[1][0] - np.clip(rect[1][0] * (1-np.sqrt(solidity)), 0, 6)\n            height = rect[1][1] - np.clip(rect[1][1] * (1-np.sqrt(solidity)), 0, 4)\n            points = cv2.boxPoints((rect[0], (width, height), rect[2]))\n            points = np.int0(points)\n            p = np.reshape(points, -1)\n            f.write('{},{},{},{},{},{},{},{},{}\\r\\n'\n                    .format(p[0], p[1], p[2], p[3], p[4], p[5], p[6], p[7], 1))\n"
  },
  {
    "path": "util/graph.py",
    "content": "from __future__ import print_function\nfrom __future__ import division\nfrom __future__ import absolute_import\n\nimport numpy as np\nimport time\nfrom util.misc import norm2\n\nclass Data(object):\n    def __init__(self, name):\n        self.__name = name\n        self.__links = set()\n\n    @property\n    def name(self):\n        return self.__name\n\n    @property\n    def links(self):\n        return set(self.__links)\n\n    def add_link(self, other, score):\n        self.__links.add(other)\n        other.__links.add(self)\n\n\ndef connected_components(nodes, score_dict, th):\n    '''\n    conventional connected components searching\n    '''\n    result = []\n    nodes = set(nodes)\n    while nodes:\n        n = nodes.pop()\n        group = {n}\n        queue = [n]\n        while queue:\n            n = queue.pop(0)\n            if th is not None:\n                neighbors = {l for l in n.links if score_dict[tuple(sorted([n.name, l.name]))] >= th}\n            else:\n                neighbors = n.links\n            neighbors.difference_update(group)\n            nodes.difference_update(neighbors)\n            group.update(neighbors)\n            queue.extend(neighbors)\n        result.append(group)\n    return result\n\n\ndef connected_components_constraint(nodes, max_sz, score_dict=None, th=None):\n    '''\n    only use edges whose scores are above `th`\n    if a component is larger than `max_sz`, all the nodes in this component are added into `remain` and returned for next iteration.\n    '''\n    result = []\n    remain = set()\n    nodes = set(nodes)\n    while nodes:\n        n = nodes.pop()\n        group = {n}\n        queue = [n]\n        valid = True\n        while queue:\n            n = queue.pop(0)\n            if th is not None:\n                neighbors = {l for l in n.links if score_dict[tuple(sorted([n.name, l.name]))] >= th}\n            else:\n                neighbors = n.links\n            
neighbors.difference_update(group)\n            nodes.difference_update(neighbors)\n            group.update(neighbors)\n            queue.extend(neighbors)\n            if len(group) > max_sz or len(remain.intersection(neighbors)) > 0:\n                # if this group is larger than `max_sz`, add the nodes into `remain`\n                valid = False\n                remain.update(group)\n                break\n        if valid: # if this group is smaller than or equal to `max_sz`, finalize it.\n            result.append(group)\n    return result, remain\n\n\ndef graph_propagation_naive(edges, score, th, bboxs=None, dis_thresh=50, pool='avg'):\n\n    edges = np.sort(edges, axis=1)\n\n    score_dict = {}  # score lookup table\n    if pool is None:\n        for i, e in enumerate(edges):\n            score_dict[e[0], e[1]] = score[i]\n    elif pool == 'avg':\n        for i, e in enumerate(edges):\n            if bboxs is not None:\n                box1 = bboxs[e[0]][:8].reshape(4, 2)\n                box2 = bboxs[e[1]][:8].reshape(4, 2)\n                c1 = np.mean(box1, 0)\n                c2 = np.mean(box2, 0)\n                dst = norm2(c1 - c2)\n                if dst > dis_thresh:\n                    score[i] = 0\n            if (e[0], e[1]) in score_dict:\n                score_dict[e[0], e[1]] = 0.5 * (score_dict[e[0], e[1]] + score[i])\n            else:\n                score_dict[e[0], e[1]] = score[i]\n\n    elif pool == 'max':\n        for i, e in enumerate(edges):\n            if (e[0], e[1]) in score_dict:\n                score_dict[e[0], e[1]] = max(score_dict[e[0], e[1]], score[i])\n            else:\n                score_dict[e[0], e[1]] = score[i]\n    else:\n        raise ValueError('Pooling operation not supported')\n\n    nodes = np.sort(np.unique(edges.flatten()))\n    mapping = -1 * np.ones((nodes.max()+1), dtype=np.int64)  # np.int was removed in NumPy 1.24\n    mapping[nodes] = np.arange(nodes.shape[0])\n    link_idx = mapping[edges]\n    vertex = [Data(n) for n in nodes]\n    for l, s in 
zip(link_idx, score):\n        vertex[l[0]].add_link(vertex[l[1]], s)\n\n    # first iteration\n    comps = connected_components(vertex, score_dict,th)\n\n    return comps\n\n\ndef graph_search(edges, scores, edges_num, th=None):\n    # graph search\n    scores = scores.reshape((-1, edges_num))\n    select_index = np.argsort(scores, axis=1)[:, -2:]\n    edges = np.sort(edges, axis=1).reshape((-1, edges_num, 2))\n\n    score_dict = {}\n    for i, ips in enumerate(select_index):\n        edg = edges[i]\n        si = scores[i]\n        for j, idx in enumerate(ips):\n            e = edg[idx, :]\n            if (e[0], e[1]) in score_dict:\n                score_dict[e[0], e[1]] = 0.5 * (score_dict[e[0], e[1]] + si[j])\n            else:\n                score_dict[e[0], e[1]] = si[j]\n\n    nodes = np.sort(np.unique(edges.flatten()))\n    vertex = [Data(n) for n in nodes]\n    for (key, value) in score_dict.items():\n        vertex[key[0]].add_link(vertex[key[1]], value)\n\n    comps = connected_components(vertex, score_dict, th)\n\n    return comps\n\n\ndef graph_propagation(edges, score, max_sz, step=0.1, beg_th=0.5, pool=None):\n\n    edges = np.sort(edges, axis=1)\n    th = score.min()\n    # th = beg_th\n    # construct graph\n    score_dict = {} # score lookup table\n    if pool is None:\n        for i,e in enumerate(edges):\n            score_dict[e[0], e[1]] = score[i]\n    elif pool == 'avg':\n        for i,e in enumerate(edges):\n            if (e[0], e[1]) in score_dict:\n                score_dict[e[0], e[1]] = 0.5*(score_dict[e[0], e[1]] + score[i])\n            else:\n                score_dict[e[0], e[1]] = score[i]\n\n    elif pool == 'max':\n        for i,e in enumerate(edges):\n            if (e[0],e[1]) in score_dict:\n                score_dict[e[0], e[1]] = max(score_dict[e[0], e[1]] , score[i])\n            else:\n                score_dict[e[0], e[1]] = score[i]\n    else:\n        raise ValueError('Pooling operation not supported')\n\n    nodes = 
np.sort(np.unique(edges.flatten()))\n    mapping = -1 * np.ones((nodes.max()+1), dtype=np.int64)  # np.int was removed in NumPy 1.24\n    mapping[nodes] = np.arange(nodes.shape[0])\n    link_idx = mapping[edges]\n    vertex = [Data(n) for n in nodes]\n    for l, s in zip(link_idx, score):\n        vertex[l[0]].add_link(vertex[l[1]], s)\n\n    # first iteration\n    comps, remain = connected_components_constraint(vertex, max_sz)\n\n    # iteration\n    components = comps[:]\n    while remain:\n        th = th + (1 - th) * step\n        comps, remain = connected_components_constraint(remain, max_sz, score_dict, th)\n        components.extend(comps)\n    return components\n\n\ndef graph_propagation_soft(edges, score, max_sz, step=0.1, **kwargs):\n\n    edges = np.sort(edges, axis=1)\n    th = score.min()\n\n    # construct graph\n    score_dict = {} # score lookup table\n    for i, e in enumerate(edges):\n        score_dict[e[0], e[1]] = score[i]\n\n    nodes = np.sort(np.unique(edges.flatten()))\n    mapping = -1 * np.ones((nodes.max()+1), dtype=np.int64)  # np.int was removed in NumPy 1.24\n    mapping[nodes] = np.arange(nodes.shape[0])\n    link_idx = mapping[edges]\n    vertex = [Data(n) for n in nodes]\n    for l, s in zip(link_idx, score):\n        vertex[l[0]].add_link(vertex[l[1]], s)\n\n    # first iteration\n    comps, remain = connected_components_constraint(vertex, max_sz)\n    first_vertex_idx = np.array([mapping[n.name] for c in comps for n in c])\n    fusion_vertex_idx = np.setdiff1d(np.arange(nodes.shape[0]), first_vertex_idx, assume_unique=True)\n    # iteration\n    components = comps[:]\n    while remain:\n        th = th + (1 - th) * step\n        comps, remain = connected_components_constraint(remain, max_sz, score_dict, th)\n        components.extend(comps)\n    label_dict = {}\n    for i, c in enumerate(components):\n        for n in c:\n            label_dict[n.name] = i\n    print('Propagation ...')\n    prop_vertex = [vertex[idx] for idx in fusion_vertex_idx]\n    label, label_fusion = diffusion(prop_vertex, 
label_dict, score_dict, **kwargs)\n    return label, label_fusion\n\n\ndef diffusion(vertex, label, score_dict, max_depth=5, weight_decay=0.6, normalize=True):\n    class BFSNode():\n        def __init__(self, node, depth, value):\n            self.node = node\n            self.depth = depth\n            self.value = value\n            \n    label_fusion = {}\n    for name in label.keys():\n        label_fusion[name] = {label[name]: 1.0}\n    prog = 0\n    prog_step = max(1, len(vertex) // 20)  # guard against modulo-by-zero for small vertex sets\n    start = time.time()\n    for root in vertex:\n        if prog % prog_step == 0:\n            print(\"progress: {} / {}, elapsed time: {}\".format(prog, len(vertex), time.time() - start))\n        prog += 1\n        #queue = {[root, 0, 1.0]}\n        queue = {BFSNode(root, 0, 1.0)}\n        visited = [root.name]\n        root_label = label[root.name]\n        while queue:\n            curr = queue.pop()\n            if curr.depth >= max_depth: # pruning\n                continue\n            neighbors = curr.node.links\n            tmp_value = []\n            tmp_neighbor = []\n            for n in neighbors:\n                if n.name not in visited:\n                    sub_value = score_dict[tuple(sorted([curr.node.name, n.name]))] * weight_decay * curr.value\n                    tmp_value.append(sub_value)\n                    tmp_neighbor.append(n)\n                    if root_label not in label_fusion[n.name].keys():\n                        label_fusion[n.name][root_label] = sub_value\n                    else:\n                        label_fusion[n.name][root_label] += sub_value\n                    visited.append(n.name)\n                    #queue.add([n, curr.depth+1, sub_value])\n            sortidx = np.argsort(tmp_value)[::-1]\n            for si in sortidx:\n                queue.add(BFSNode(tmp_neighbor[si], curr.depth+1, tmp_value[si]))\n    if normalize:\n        for name in label_fusion.keys():\n            summ = sum(label_fusion[name].values())\n            
for k in label_fusion[name].keys():\n                label_fusion[name][k] /= summ\n    return label, label_fusion\n\n\ndef clusters2labels(clusters, n_nodes):\n    labels = (-1)* np.ones((n_nodes,))\n    for ci, c in enumerate(clusters):\n        for xid in c:\n            labels[xid.name] = ci\n    assert np.sum(labels < 0) < 1\n    return labels\n\n\ndef single_remove(bbox, pred):\n    single_idcs = np.zeros_like(pred)\n    pred_unique = np.unique(pred)\n    for u in pred_unique:\n        idcs = pred == u\n        if np.sum(idcs) == 1:\n            single_idcs[np.where(idcs)[0][0]] = 1\n    remain_idcs = [i for i in range(len(pred)) if not single_idcs[i]]\n    remain_idcs = np.asarray(remain_idcs)\n    return bbox[remain_idcs, :], pred[remain_idcs]\n\n"
  },
  {
    "path": "util/io.py",
    "content": "#coding=utf-8\n'''\nCreated on September 27, 2016\n\n@author: dengdan\n\nTool functions for file system operation and I/O,\nin the style of linux shell commands.\n'''\nimport os\nimport pickle as pkl\nimport subprocess\nimport logging\nfrom . import strs, io\n\n\ndef mkdir(path):\n    \"\"\"\n    If the target directory does not exist, it and its parent directories will be created.\n    \"\"\"\n    path = get_absolute_path(path)\n    if not exists(path):\n        os.makedirs(path)\n    return path\n\ndef make_parent_dir(path):\n    \"\"\"make the parent directories for a file.\"\"\"\n    parent_dir = get_dir(path)\n    mkdir(parent_dir)\n    \n    \ndef pwd():\n    return os.getcwd()\n\ndef dump(path, obj):\n    path = get_absolute_path(path)\n    parent_path = get_dir(path)\n    mkdir(parent_path)\n    with open(path, 'wb') as f:  # pickle requires binary mode\n        logging.info('dumping file:' + path)\n        pkl.dump(obj, f)\n\ndef load(path):\n    path = get_absolute_path(path)\n    with open(path, 'rb') as f:  # pickle requires binary mode\n        data = pkl.load(f)\n    return data\n\ndef join_path(a, *p):\n    return os.path.join(a, *p)\n\ndef is_dir(path):\n    path = get_absolute_path(path)\n    return os.path.isdir(path)\n\nis_directory = is_dir\n\ndef is_path(path):\n    path = get_absolute_path(path)\n    return os.path.exists(path)\n    \ndef get_dir(path):\n    '''\n    return the directory it belongs to.\n    if path is a directory itself, it will be returned.\n    '''\n    path = get_absolute_path(path)\n    if is_dir(path):\n        return path\n    return os.path.split(path)[0]\n\ndef get_parent_dir(path):\n    current_dir = get_dir(path)\n    return get_absolute_path(join_path(current_dir, '..'))\n\ndef get_filename(path):\n    return os.path.split(path)[1]\n\ndef get_absolute_path(p):\n    if p.startswith('~'):\n        p = os.path.expanduser(p)\n    return os.path.abspath(p)\n\ndef cd(p):\n    p = get_absolute_path(p)\n    os.chdir(p)\n    \ndef ls(path = '.', suffix = None):\n    \"\"\"\n  
  list files in a directory.\n    return file names in a list\n    \"\"\"\n    path = get_absolute_path(path)\n    files = os.listdir(path)\n\n    if suffix is None:\n        return files\n        \n    filtered = []\n    for f in files:\n        if strs.ends_with(f, suffix, ignore_case = True):\n            filtered.append(f)\n    \n    return filtered\n\ndef find_files(pattern):\n    import glob\n    return glob.glob(pattern)\n\ndef read_lines(p):\n    \"\"\"return the text in a file in lines as a list \"\"\"\n    p = get_absolute_path(p)\n    with open(p, 'r') as f:\n        return f.readlines()\n    \ndef write_lines(p, lines, append_break = False):\n    p = get_absolute_path(p)\n    make_parent_dir(p)\n    with open(p, 'w') as f:\n        for line in lines:\n            if append_break:\n                f.write(line + '\\n')\n            else:\n                f.write(line)\n\ndef cat(p):\n    \"\"\"return the text in a file as a whole\"\"\"\n    cmd = 'cat ' + p\n    return subprocess.getoutput(cmd)\n\ndef exists(path):\n    path = get_absolute_path(path)\n    return os.path.exists(path)\n\ndef not_exists(path):\n    return not exists(path)\n\ndef load_mat(path):\n    import scipy.io as sio\n    path = get_absolute_path(path)\n    return sio.loadmat(path)\n\ndef dump_mat(path, dict_obj, append = True):\n    import scipy.io as sio\n    path = get_absolute_path(path)\n    make_parent_dir(path)\n    sio.savemat(file_name = path, mdict = dict_obj, appendmat = append)\n    \ndef dir_mat(path):\n    '''\n    list the variables in mat file.\n    return a list: [(name, shape, dtype), ...]\n    '''\n    import scipy.io as sio\n    path = get_absolute_path(path)\n    return sio.whosmat(path)\n    \nSIZE_UNIT_K = 1024\nSIZE_UNIT_M = SIZE_UNIT_K ** 2\nSIZE_UNIT_G = SIZE_UNIT_K ** 3\ndef get_file_size(path, unit = SIZE_UNIT_K):\n    size = os.path.getsize(get_absolute_path(path))\n    return size * 1.0 / unit\n    \n    \ndef create_h5(path):\n    import h5py\n    path = 
get_absolute_path(path)\n    make_parent_dir(path)\n    return h5py.File(path, 'w')\n\ndef open_h5(path, mode = 'r'):\n    import h5py\n    path = get_absolute_path(path)\n    return h5py.File(path, mode)\n    \ndef read_h5(h5, key):\n    return h5[key][:]\n\ndef read_h5_attrs(h5, key, attrs):\n    return h5[key].attrs[attrs]\n    \ndef copy(src, dest):\n    io.make_parent_dir(dest)\n    import shutil\n    shutil.copy(get_absolute_path(src), get_absolute_path(dest))\n    \ncp = copy\n\ndef remove(p):\n    os.remove(get_absolute_path(p))\nrm = remove\n\ndef search(pattern, path, file_only = True):\n    \"\"\"\n    Search files whose name matches the given pattern. The search scope\n    is the directory and sub-directories of 'path'.\n    \"\"\"\n    path = get_absolute_path(path)\n    pattern_here = io.join_path(path, pattern)\n    targets = []\n    \n    # find matches in the current directory\n    candidates = find_files(pattern_here)\n    for can in candidates:\n        if io.is_dir(can) and file_only:\n            continue\n        else:\n            targets.append(can)\n            \n    # find matches in sub-dirs\n    files = ls(path)\n    for f in files:\n        fpath = io.join_path(path, f)\n        if is_dir(fpath):\n            targets_in_sub_dir = search(pattern, fpath, file_only)\n            targets.extend(targets_in_sub_dir)\n    return targets\n\ndef dump_json(path, data):\n    import ujson as json\n    path = get_absolute_path(path)\n    make_parent_dir(path)\n\n    with open(path, 'w') as f:\n        json.dump(data, f)\n    return path\n\ndef load_json(path):\n    import ujson as json\n    path = get_absolute_path(path)\n    with open(path, 'r') as f:\n        return json.load(f)\n"
  },
  {
    "path": "util/logging.py",
    "content": "from __future__ import absolute_import\nimport os\nimport sys\nimport numpy as np\nimport tensorflow as tf\nimport scipy.misc \ntry:\n  from StringIO import StringIO  # Python 2.7\nexcept ImportError:\n  from io import BytesIO         # Python 3.x\n\nfrom .osutils import mkdir_if_missing\n\nfrom config import get_args\nglobal_args = get_args(sys.argv[1:])\n\nif global_args.run_on_remote:\n  import moxing as mox\n  mox.file.shift(\"os\", \"mox\")\n\nclass Logger(object):\n  def __init__(self, fpath=None):\n    self.console = sys.stdout\n    self.file = None\n    if fpath is not None:\n      if global_args.run_on_remote:\n        dir_name = os.path.dirname(fpath)\n        if not mox.file.exists(dir_name):\n          mox.file.make_dirs(dir_name)\n          print('=> making dir ', dir_name)\n        self.file = mox.file.File(fpath, 'w')\n        # self.file = open(fpath, 'w')\n      else:\n        mkdir_if_missing(os.path.dirname(fpath))\n        self.file = open(fpath, 'w')\n\n  def __del__(self):\n    self.close()\n\n  def __enter__(self):\n    # return self so a 'with' block binds the logger instead of None\n    return self\n\n  def __exit__(self, *args):\n    self.close()\n\n  def write(self, msg):\n    self.console.write(msg)\n    if self.file is not None:\n      self.file.write(msg)\n\n  def flush(self):\n    self.console.flush()\n    if self.file is not None:\n      self.file.flush()\n      os.fsync(self.file.fileno())\n\n  def close(self):\n    self.console.close()\n    if self.file is not None:\n      self.file.close()\n\n\nclass TFLogger(object):\n  def __init__(self, log_dir=None):\n    \"\"\"Create a summary writer logging to log_dir.\"\"\"\n    if log_dir is not None:\n      mkdir_if_missing(log_dir)\n    self.writer = tf.summary.FileWriter(log_dir)\n\n  def scalar_summary(self, tag, value, step):\n    \"\"\"Log a scalar variable.\"\"\"\n    summary = tf.Summary(value=[tf.Summary.Value(tag=tag, simple_value=value)])\n    self.writer.add_summary(summary, step)\n    self.writer.flush()\n\n  def image_summary(self, 
tag, images, step):\n    \"\"\"Log a list of images.\"\"\"\n\n    img_summaries = []\n    for i, img in enumerate(images):\n      # Write the image to a string\n      try:\n        s = StringIO()\n      except:\n        s = BytesIO()\n      scipy.misc.toimage(img).save(s, format=\"png\")\n\n      # Create an Image object\n      img_sum = tf.Summary.Image(encoded_image_string=s.getvalue(),\n                                 height=img.shape[0],\n                                 width=img.shape[1])\n      # Create a Summary value\n      img_summaries.append(tf.Summary.Value(tag='%s/%d' % (tag, i), image=img_sum))\n\n    # Create and write Summary\n    summary = tf.Summary(value=img_summaries)\n    self.writer.add_summary(summary, step)\n    self.writer.flush()\n        \n  def histo_summary(self, tag, values, step, bins=1000):\n    \"\"\"Log a histogram of the tensor of values.\"\"\"\n\n    # Create a histogram using numpy\n    counts, bin_edges = np.histogram(values, bins=bins)\n\n    # Fill the fields of the histogram proto\n    hist = tf.HistogramProto()\n    hist.min = float(np.min(values))\n    hist.max = float(np.max(values))\n    hist.num = int(np.prod(values.shape))\n    hist.sum = float(np.sum(values))\n    hist.sum_squares = float(np.sum(values**2))\n\n    # Drop the start of the first bin\n    bin_edges = bin_edges[1:]\n\n    # Add bin edges and counts\n    for edge in bin_edges:\n      hist.bucket_limit.append(edge)\n    for c in counts:\n      hist.bucket.append(c)\n\n    # Create and write Summary\n    summary = tf.Summary(value=[tf.Summary.Value(tag=tag, histo=hist)])\n    self.writer.add_summary(summary, step)\n    self.writer.flush()\n\n  def close(self):\n    self.writer.close()"
  },
  {
    "path": "util/meters.py",
    "content": "from __future__ import absolute_import\n\n\nclass AverageMeter(object):\n  \"\"\"Computes and stores the average and current value\"\"\"\n\n  def __init__(self):\n    self.val = 0\n    self.avg = 0\n    self.sum = 0\n    self.count = 0\n\n  def reset(self):\n    self.val = 0\n    self.avg = 0\n    self.sum = 0\n    self.count = 0\n\n  def update(self, val, n=1):\n    self.val = val\n    self.sum += val * n\n    self.count += n\n    self.avg = self.sum / self.count"
  },
  {
    "path": "util/misc.py",
    "content": "import numpy as np\nimport errno\nimport os\nimport cv2\nimport math\nfrom shapely.geometry import Polygon\nfrom cfglib.config import config as cfg\nfrom scipy import ndimage as ndimg\n\ndef to_device(*tensors):\n    if len(tensors) < 2:\n        return tensors[0].to(cfg.device, non_blocking=True)\n    return (t.to(cfg.device, non_blocking=True) for t in tensors)\n\n\ndef mkdirs(newdir):\n    \"\"\"\n    make directory with parent path\n    :param newdir: target path\n    \"\"\"\n    try:\n        if not os.path.exists(newdir):\n            os.makedirs(newdir)\n    except OSError as err:\n        # Reraise the error unless it's about an already existing directory\n        if err.errno != errno.EEXIST or not os.path.isdir(newdir):\n            raise\n\n\ndef rescale_result(image, bbox_contours, H, W):\n    ori_H, ori_W = image.shape[:2]\n    image = cv2.resize(image, (W, H))\n    contours = list()\n    for cont in bbox_contours:\n        # if cv2.contourArea(cont) < 300:\n        #     continue\n        cont[:, 0] = (cont[:, 0] * W / ori_W).astype(int)\n        cont[:, 1] = (cont[:, 1] * H / ori_H).astype(int)\n        contours.append(cont)\n    return image, contours\n\n\ndef fill_hole(input_mask):\n    h, w = input_mask.shape\n    canvas = np.zeros((h + 2, w + 2), np.uint8)\n    canvas[1:h + 1, 1:w + 1] = input_mask.copy()\n\n    mask = np.zeros((h + 4, w + 4), np.uint8)\n\n    cv2.floodFill(canvas, mask, (0, 0), 1)\n    canvas = canvas[1:h + 1, 1:w + 1].astype(bool)  # the np.bool alias was removed in NumPy 1.24\n\n    return (~canvas | input_mask.astype(np.uint8))\n\n\ndef regularize_sin_cos(sin, cos):\n    # regularization\n    scale = np.sqrt(1.0 / (sin ** 2 + cos ** 2))\n    return sin * scale, cos * scale\n\n\ndef gaussian2D(shape, sigma=1):\n    m, n = [(ss - 1.) / 2. 
for ss in shape]\n    y, x = np.ogrid[-m:m + 1, -n:n + 1]\n\n    h = np.exp(-(x * x + y * y) / (2 * sigma * sigma))\n    h[h < np.finfo(h.dtype).eps * h.max()] = 0\n    return h\n\n\ndef draw_gaussian(heatmap, center, radius, k=1, delte=6):\n    diameter = 2 * radius + 1\n    gaussian = gaussian2D((diameter, diameter), sigma=diameter / delte)\n\n    x, y = center\n\n    height, width = heatmap.shape[0:2]\n\n    left, right = min(x, radius), min(width - x, radius + 1)\n    top, bottom = min(y, radius), min(height - y, radius + 1)\n\n    masked_heatmap = heatmap[y - top:y + bottom, x - left:x + right]\n    masked_gaussian = gaussian[radius - top:radius + bottom, radius - left:radius + right]\n    np.maximum(masked_heatmap, masked_gaussian * k, out=masked_heatmap)\n\n\ndef gaussian_radius(det_size, min_overlap=0.7):\n    height, width = det_size\n\n    a1 = 1\n    b1 = (height + width)\n    c1 = width * height * (1 - min_overlap) / (1 + min_overlap)\n    sq1 = np.sqrt(b1 ** 2 - 4 * a1 * c1)\n    r1 = (b1 + sq1) / 2\n\n    a2 = 4\n    b2 = 2 * (height + width)\n    c2 = (1 - min_overlap) * width * height\n    sq2 = np.sqrt(b2 ** 2 - 4 * a2 * c2)\n    r2 = (b2 + sq2) / 2\n\n    a3 = 4 * min_overlap\n    b3 = -2 * min_overlap * (height + width)\n    c3 = (min_overlap - 1) * width * height\n    sq3 = np.sqrt(b3 ** 2 - 4 * a3 * c3)\n    r3 = (b3 + sq3) / 2\n    return min(r1, r2, r3)\n\n\ndef point_dist_to_line(line, p3):\n    # compute the distance from point p3 to the line through p1-p2\n    # line = (p1, p2)\n    # equivalent to np.linalg.norm(np.cross(p2 - p1, p1 - p3)) * 1.0 / np.linalg.norm(p2 - p1)\n    p1, p2 = line\n    d = p2 - p1\n\n    def l2(p):\n        return math.sqrt(p[0] * p[0] + p[1] * p[1])\n\n    if l2(d) > 0:\n        distance = abs(d[1] * p3[0] - d[0] * p3[1] + p2[0] * p1[1] - p2[1] * p1[0]) / l2(d)\n    else:\n        distance = math.sqrt((p3[0]-p2[0])**2 + (p3[1]-p2[1])**2)\n\n    return 
distance\n\n\nclass AverageMeter(object):\n    \"\"\"Computes and stores the average and current value\"\"\"\n    def __init__(self):\n        self.reset()\n\n    def reset(self):\n        self.val = 0\n        self.avg = 0\n        self.sum = 0\n        self.count = 0\n\n    def update(self, val, n=1):\n        self.val = val\n        self.sum += val * n\n        self.count += n\n        self.avg = self.sum / self.count\n\n\ndef norm2(x, axis=None):\n    # compare against None explicitly: axis=0 is a valid but falsy value\n    if axis is not None:\n        return np.sqrt(np.sum(x ** 2, axis=axis))\n    return np.sqrt(np.sum(x ** 2))\n\n\ndef cos(p1, p2):\n    return (p1 * p2).sum() / (norm2(p1) * norm2(p2))\n\n\ndef vector_sin(v):\n    assert len(v) == 2\n    # sin = y / (sqrt(x^2 + y^2))\n    l = np.sqrt(v[0] ** 2 + v[1] ** 2) + 1e-5\n    return v[1] / l\n\n\ndef vector_cos(v):\n    assert len(v) == 2\n    # cos = x / (sqrt(x^2 + y^2))\n    l = np.sqrt(v[0] ** 2 + v[1] ** 2) + 1e-5\n    return v[0] / l\n\n\ndef find_bottom(pts):\n\n    if len(pts) > 4:\n        e = np.concatenate([pts, pts[:3]])\n        candidate = []\n        for i in range(1, len(pts) + 1):\n            v_prev = e[i] - e[i - 1]\n            v_next = e[i + 2] - e[i + 1]\n            if cos(v_prev, v_next) < -0.875:\n                candidate.append((i % len(pts), (i + 1) % len(pts), norm2(e[i] - e[i + 1])))\n\n        if len(candidate) != 2 or candidate[0][0] == candidate[1][1] or candidate[0][1] == candidate[1][0]:\n            # if the number of candidates != 2, or the two bottoms are joined, select the 2 farthest edges\n            mid_list = []\n            dist_list = []\n            if len(candidate) > 2:\n\n                bottom_idx = np.argsort([angle for s1, s2, angle in candidate])[0:2]\n                bottoms = [candidate[bottom_idx[0]][:2], candidate[bottom_idx[1]][0:2]]\n                long_edge1, long_edge2 = find_long_edges(pts, bottoms)\n                edge_length1 = [norm2(pts[e1] - pts[e2]) for e1, e2 in long_edge1]\n                edge_length2 = [norm2(pts[e1] - pts[e2]) 
for e1, e2 in long_edge2]\n                l1 = sum(edge_length1)\n                l2 = sum(edge_length2)\n                len1 = len(edge_length1)\n                len2 = len(edge_length2)\n\n                if l1 > 2*l2 or l2 > 2*l1 or len1 == 0 or len2 == 0:\n                    for i in range(len(pts)):\n                        mid_point = (e[i] + e[(i + 1) % len(pts)]) / 2\n                        mid_list.append((i, (i + 1) % len(pts), mid_point))\n\n                    for i in range(len(pts)):\n                        for j in range(len(pts)):\n                            s1, e1, mid1 = mid_list[i]\n                            s2, e2, mid2 = mid_list[j]\n                            dist = norm2(mid1 - mid2)\n                            dist_list.append((s1, e1, s2, e2, dist))\n                    bottom_idx = np.argsort([dist for s1, e1, s2, e2, dist in dist_list])[-1]\n                    bottoms = [dist_list[bottom_idx][:2], dist_list[bottom_idx][2:4]]\n            else:\n                mid_list = []\n                for i in range(len(pts)):\n                    mid_point = (e[i] + e[(i + 1) % len(pts)]) / 2\n                    mid_list.append((i, (i + 1) % len(pts), mid_point))\n\n                dist_list = []\n                for i in range(len(pts)):\n                    for j in range(len(pts)):\n                        s1, e1, mid1 = mid_list[i]\n                        s2, e2, mid2 = mid_list[j]\n                        dist = norm2(mid1 - mid2)\n                        dist_list.append((s1, e1, s2, e2, dist))\n                bottom_idx = np.argsort([dist for s1, e1, s2, e2, dist in dist_list])[-2:]\n                bottoms = [dist_list[bottom_idx[0]][:2], dist_list[bottom_idx[1]][:2]]\n        else:\n            bottoms = [candidate[0][:2], candidate[1][:2]]\n    else:\n        d1 = norm2(pts[1] - pts[0]) + norm2(pts[2] - pts[3])\n        d2 = norm2(pts[2] - pts[1]) + norm2(pts[0] - pts[3])\n        bottoms = [(0, 1), (2, 3)] if d1 < d2 else 
[(1, 2), (3, 0)]\n        # bottoms = [(0, 1), (2, 3)] if 2 * d1 < d2 and d1 > 32 else [(1, 2), (3, 0)]\n    assert len(bottoms) == 2, 'fewer than 2 bottoms'\n    return bottoms\n\n\ndef split_long_edges(points, bottoms):\n    \"\"\"\n    Find the two long edge sequences of a polygon\n    \"\"\"\n    b1_start, b1_end = bottoms[0]\n    b2_start, b2_end = bottoms[1]\n    n_pts = len(points)\n\n    i = b1_end + 1\n    long_edge_1 = []\n    while i % n_pts != b2_end:\n        long_edge_1.append((i - 1, i))\n        i = (i + 1) % n_pts\n\n    i = b2_end + 1\n    long_edge_2 = []\n    while i % n_pts != b1_end:\n        long_edge_2.append((i - 1, i))\n        i = (i + 1) % n_pts\n    return long_edge_1, long_edge_2\n\n\ndef find_long_edges(points, bottoms):\n    b1_start, b1_end = bottoms[0]\n    b2_start, b2_end = bottoms[1]\n    n_pts = len(points)\n    i = (b1_end + 1) % n_pts\n    long_edge_1 = []\n\n    while i % n_pts != b2_end:\n        start = (i - 1) % n_pts\n        end = i % n_pts\n        long_edge_1.append((start, end))\n        i = (i + 1) % n_pts\n\n    i = (b2_end + 1) % n_pts\n    long_edge_2 = []\n    while i % n_pts != b1_end:\n        start = (i - 1) % n_pts\n        end = i % n_pts\n        long_edge_2.append((start, end))\n        i = (i + 1) % n_pts\n    return long_edge_1, long_edge_2\n\n\ndef split_edge_seqence(points, n_parts):\n    pts_num = points.shape[0]\n    long_edge = [(i, (i + 1) % pts_num) for i in range(pts_num)]\n    edge_length = [norm2(points[e1] - points[e2]) for e1, e2 in long_edge]\n    point_cumsum = np.cumsum([0] + edge_length)\n    total_length = sum(edge_length)\n    length_per_part = total_length / n_parts\n\n    cur_node = 0  # first point\n    splited_result = []\n\n    for i in range(1, n_parts):\n        cur_end = i * length_per_part\n\n        while cur_end > point_cumsum[cur_node + 1]:\n            cur_node += 1\n\n        e1, e2 = long_edge[cur_node]\n        e1, e2 = points[e1], points[e2]\n\n        # start_point = 
points[long_edge[cur_node]]\n        end_shift = cur_end - point_cumsum[cur_node]\n        ratio = end_shift / edge_length[cur_node]\n        new_point = e1 + ratio * (e2 - e1)\n        # print(cur_end, point_cumsum[cur_node], end_shift, edge_length[cur_node], '=', new_point)\n        splited_result.append(new_point)\n\n    # add first and last point\n    p_first = points[long_edge[0][0]]\n    p_last = points[long_edge[-1][1]]\n    splited_result = [p_first] + splited_result + [p_last]\n    return np.stack(splited_result)\n\n\ndef split_edge_seqence_with_cell_division(points, n_parts):\n    points_seq = list(points)\n    pts_num = len(points_seq)\n\n    if pts_num <= n_parts:\n        long_edge = [(i, (i + 1) % pts_num) for i in range(pts_num)]\n        edge_length = [int(norm2(points[e1] - points[e2])) for e1, e2 in long_edge]\n        while pts_num < n_parts:\n            e = np.argmax(np.array(edge_length))\n            new_pts = (points_seq[e] + points_seq[(e+1) % pts_num])*0.5\n            points_seq.insert(e+1, new_pts)\n            d = int(0.5 * (edge_length[e]-1))\n            edge_length[e] = d\n            edge_length.insert(e+1, d)\n            pts_num = len(points_seq)\n    else:\n        pass\n\n    return np.stack(points_seq).astype(int)\n\n\ndef split_edge_seqence_by_step(points, long_edge1, long_edge2, step=16.0):\n\n    edge_length1 = [norm2(points[e1] - points[e2]) for e1, e2 in long_edge1]\n    edge_length2 = [norm2(points[e1] - points[e2]) for e1, e2 in long_edge2]\n    # average the two long-edge lengths to decide how many boxes to split into\n    total_length = (sum(edge_length1)+sum(edge_length2))/2\n    n_parts = math.ceil(float(total_length) / step)\n    try:\n        inner1 = split_edge_seqence(points, long_edge1, n_parts=n_parts)\n        inner2 = split_edge_seqence(points, long_edge2, n_parts=n_parts)\n    except Exception:\n        # re-raise instead of falling through with inner1/inner2 unbound\n        print(edge_length1)\n        print(edge_length2)\n        raise\n\n    return inner1, inner2\n\n\ndef disjoint_find(x, F):\n    if F[x] == x:\n        return x\n    F[x] = disjoint_find(F[x], 
F)\n    return F[x]\n\n\ndef disjoint_merge(x, y, F):\n    x = disjoint_find(x, F)\n    y = disjoint_find(y, F)\n    if x == y:\n        return False\n    F[y] = x\n    return True\n\n\ndef merge_polygons(polygons, merge_map):\n\n    def merge_two_polygon(p1, p2):\n        p2 = Polygon(p2)\n        merged = p1.union(p2)\n        return merged\n\n    merge_map = [disjoint_find(x, merge_map) for x in range(len(merge_map))]\n    merge_map = np.array(merge_map)\n    final_polygons = []\n\n    for i in np.unique(merge_map):\n        merge_idx = np.where(merge_map == i)[0]\n        if len(merge_idx) > 0:\n            merged = Polygon(polygons[merge_idx[0]])\n            for j in range(1, len(merge_idx)):\n                merged = merge_two_polygon(merged, polygons[merge_idx[j]])\n            x, y = merged.exterior.coords.xy\n            final_polygons.append(np.stack([x, y], axis=1).astype(int))\n\n    return final_polygons\n\n\ndef get_sample_point(text_mask, num_points, approx_factor, scales=None):\n    # get sample point in contours\n    contours, _ = cv2.findContours(text_mask.astype(np.uint8), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)\n    epsilon = approx_factor * cv2.arcLength(contours[0], True)\n    approx = cv2.approxPolyDP(contours[0], epsilon, True).reshape((-1, 2))\n    # approx = contours[0].reshape((-1, 2))\n    if scales is None:\n        ctrl_points = split_edge_seqence(approx, num_points)\n    else:\n        ctrl_points = split_edge_seqence(approx*scales, num_points)\n    ctrl_points = np.array(ctrl_points[:num_points, :]).astype(np.int32)\n\n    return ctrl_points\n\n\n"
  },
  {
    "path": "util/pbox.py",
    "content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n__author__ = '古溪'\n\nimport numpy as np\nfrom typing import List\n\n\ndef functools_reduce(a):\n    # use the built-in functools module\n    import functools\n    import operator\n    return functools.reduce(operator.concat, a)\n\n\ndef minConnectPath(list_all: List[list]):\n    list_nodo = list_all.copy()\n    res = []\n    ept = [0, 0]\n\n    def norm2(a, b):\n        \"\"\"Euclidean distance between two points\"\"\"\n        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5\n\n    dict00 = {}  # format: {distance: [start point, end point]}\n    dict11 = {}  # format: {distance: [start point, end point]}\n    # seed with an initial point\n    ept[0] = list_nodo[0]  # left end point\n    ept[1] = list_nodo[0]  # right end point\n    list_nodo.remove(list_nodo[0])\n    while list_nodo:\n        for i in list_nodo:  # i: point still to be connected\n            length0 = norm2(i, ept[0])  # distance from i to the left end point\n            dict00[length0] = [i, ept[0]]\n            length1 = norm2(ept[1], i)  # distance from the right end point to i\n            dict11[length1] = [ept[1], i]\n        key0 = min(dict00.keys())\n        key1 = min(dict11.keys())\n\n        if key0 <= key1:\n            ss = dict00[key0][0]\n            ee = dict00[key0][1]\n            res.insert(0, [list_all.index(ss), list_all.index(ee)])\n            list_nodo.remove(ss)\n            ept[0] = ss\n        else:\n            ss = dict11[key1][0]\n            ee = dict11[key1][1]\n            res.append([list_all.index(ss), list_all.index(ee)])\n            list_nodo.remove(ee)\n            ept[1] = ee\n\n        dict00 = {}\n        dict11 = {}\n\n    path = functools_reduce(res)\n    path = sorted(set(path), key=path.index)  # deduplicate, preserving order\n\n    return res, path\n\n\ndef bbox_transfor_inv(radius_map, sin_map, cos_map, score_map, wclip=(2, 8), expend=1.0):\n    xy_text = np.argwhere(score_map > 0)\n    # sort the text boxes via the y axis\n    xy_text = xy_text[np.argsort(xy_text[:, 0])]\n    origin = xy_text\n    radius = radius_map[xy_text[:, 0], xy_text[:, 1], :]\n    sin = sin_map[xy_text[:, 0], xy_text[:, 
1]]\n    cos = cos_map[xy_text[:, 0], xy_text[:, 1]]\n    dtx = radius[:, 0] * cos * expend\n    dty = radius[:, 0] * sin * expend\n    ddx = radius[:, 1] * cos * expend\n    ddy = radius[:, 1] * sin * expend\n    topp = origin + np.stack([dty, dtx], axis=-1)\n    botp = origin - np.stack([ddy, ddx], axis=-1)\n    width = (radius[:, 0] + radius[:, 1]) // 3\n    width = np.clip(width, wclip[0], wclip[1])\n\n    top1 = topp - np.stack([width * cos, -width * sin], axis=-1)\n    top2 = topp + np.stack([width * cos, -width * sin], axis=-1)\n    bot1 = botp - np.stack([width * cos, -width * sin], axis=-1)\n    bot2 = botp + np.stack([width * cos, -width * sin], axis=-1)\n\n    bbox = np.stack([top1, top2, bot2, bot1], axis=1)[:, :, ::-1]\n    bboxs = np.zeros((bbox.shape[0], 9), dtype=np.float32)\n    bboxs[:, :8] = bbox.reshape((-1, 8))\n    bboxs[:, 8] = score_map[xy_text[:, 0], xy_text[:, 1]]\n\n    return bboxs\n\n\n"
  },
  {
    "path": "util/serialization.py",
    "content": "from __future__ import print_function, absolute_import\nimport json\nimport os\nimport sys\n# import moxing as mox\nimport os.path as osp\nimport shutil\n\nimport torch\nfrom torch.nn import Parameter\n\nfrom .osutils import mkdir_if_missing\n\nfrom config import get_args\nglobal_args = get_args(sys.argv[1:])\n\nif global_args.run_on_remote:\n  import moxing as mox\n\n\ndef read_json(fpath):\n  with open(fpath, 'r') as f:\n    obj = json.load(f)\n  return obj\n\n\ndef write_json(obj, fpath):\n  mkdir_if_missing(osp.dirname(fpath))\n  with open(fpath, 'w') as f:\n    json.dump(obj, f, indent=4, separators=(',', ': '))\n\n\ndef save_checkpoint(state, is_best, fpath='checkpoint.pth.tar'):\n  print('=> saving checkpoint ', fpath)\n  if global_args.run_on_remote:\n    dir_name = osp.dirname(fpath)\n    if not mox.file.exists(dir_name):\n      mox.file.make_dirs(dir_name)\n      print('=> making dir ', dir_name)\n    local_path = \"local_checkpoint.pth.tar\"\n    torch.save(state, local_path)\n    mox.file.copy(local_path, fpath)\n    if is_best:\n      mox.file.copy(local_path, osp.join(dir_name, 'model_best.pth.tar'))\n  else:\n    mkdir_if_missing(osp.dirname(fpath))\n    torch.save(state, fpath)\n    if is_best:\n      shutil.copy(fpath, osp.join(osp.dirname(fpath), 'model_best.pth.tar'))\n\n\ndef load_checkpoint(fpath):\n  if global_args.run_on_remote:\n    mox.file.shift('os', 'mox')\n    checkpoint = torch.load(fpath)\n    print(\"=> Loaded checkpoint '{}'\".format(fpath))\n    return checkpoint\n  else:\n    load_path = fpath\n\n    if osp.isfile(load_path):\n      checkpoint = torch.load(load_path)\n      print(\"=> Loaded checkpoint '{}'\".format(load_path))\n      return checkpoint\n    else:\n      raise ValueError(\"=> No checkpoint found at '{}'\".format(load_path))\n\n\ndef copy_state_dict(state_dict, model, strip=None):\n  tgt_state = model.state_dict()\n  copied_names = set()\n  for name, param in state_dict.items():\n    if strip is not 
None and name.startswith(strip):\n      name = name[len(strip):]\n    if name not in tgt_state:\n      continue\n    if isinstance(param, Parameter):\n      param = param.data\n    if param.size() != tgt_state[name].size():\n      print('mismatch:', name, param.size(), tgt_state[name].size())\n      continue\n    tgt_state[name].copy_(param)\n    copied_names.add(name)\n\n  missing = set(tgt_state.keys()) - copied_names\n  if len(missing) > 0:\n      print(\"missing keys in state_dict:\", missing)\n\n  return model"
  },
  {
    "path": "util/shedule.py",
    "content": "from torch.optim.lr_scheduler import _LRScheduler\n\nclass FixLR(_LRScheduler):\n    \"\"\"Keeps the learning rate of each parameter group fixed at its\n    initial lr. When last_epoch=-1, sets initial lr as lr.\n\n    Args:\n        optimizer (Optimizer): Wrapped optimizer.\n        last_epoch (int): The index of last epoch. Default: -1.\n\n    Example:\n        >>> # Fixed learning rate\n        >>> scheduler = FixLR(optimizer)\n        >>> for epoch in range(100):\n        >>>     scheduler.step()\n        >>>     train(...)\n        >>>     validate(...)\n    \"\"\"\n\n    def __init__(self, optimizer, last_epoch=-1):\n        super().__init__(optimizer, last_epoch)\n\n    def get_lr(self):\n        return self.base_lrs\n"
  },
  {
    "path": "util/strs.py",
    "content": "# encoding = utf-8\ndef int_array_to_str(arr):\n    \"\"\"turn an int array to a str\"\"\"\n    return \"\".join(map(chr, arr))\n\n\ndef join(arr, splitter=','):\n    temp = []\n    for e in arr:\n        temp.append(e)\n        temp.append(splitter)\n    temp.pop()\n    return \"\".join(temp)\n\n\ndef is_str(s):\n    return type(s) == str\n\n\ndef to_lowercase(s):\n    return str.lower(s)\n\n\ndef to_uppercase(s):\n    return str.upper(s)\n\n\ndef ends_with(s, suffix, ignore_case = False):\n    \"\"\"\n    suffix: str, list, or tuple\n    \"\"\"\n    if is_str(suffix):\n        suffix = [suffix]\n    suffix = list(suffix)\n    if ignore_case:\n        for idx, suf in enumerate(suffix):\n            suffix[idx] = to_lowercase(suf)    \n        s = to_lowercase(s)\n    suffix = tuple(suffix)\n    return s.endswith(suffix)\n\n\ndef starts_with(s, prefix, ignore_case = False):\n    \"\"\"\n    prefix: str, list, or tuple\n    \"\"\"\n    if is_str(prefix):\n        prefix = [prefix]\n    prefix = list(prefix)\n    if ignore_case:\n        for idx, pre in enumerate(prefix):\n            prefix[idx] = to_lowercase(pre)    \n        s = to_lowercase(s)\n    prefix = tuple(prefix)\n    return s.startswith(prefix)\n\n\ndef contains(s, target, ignore_case = False):\n    if ignore_case:\n        s = to_lowercase(s)\n        target = to_lowercase(target)\n    return s.find(target) >= 0\n\n\ndef index_of(s, target):\n    return s.find(target)\n\n\ndef replace_all(s, old, new, reg = False):\n    if reg:\n        import re\n        targets = re.findall(old, s)\n        for t in targets:\n            s = s.replace(t, new)\n    else:\n        s = s.replace(old, new)\n    return s\n\n\ndef remove_all(s, sub):\n    return replace_all(s, sub, '')\n\n\ndef split(s, splitter, reg = False):\n    if not reg:\n        return s.split(splitter)\n    import re\n    return re.split(splitter, s)   \n\n\ndef remove_invisible(s):\n    s = replace_all(s, ' ', '')\n    s = 
replace_all(s, '\\n', '')\n    s = replace_all(s, '\\t', '')\n    s = replace_all(s, '\\r', '')\n    s = replace_all(s, '\\xef\\xbb\\xbf', '')\n    return s\n\n\ndef find_all(s, pattern):\n    import re\n    return re.findall(pattern, s)\n\n\ndef is_none_or_empty(s):\n    if s is None:\n        return True\n    return len(s) == 0\n\n\ndef to_json(obj):\n    import ujson\n    return ujson.dumps(obj)\n\n\ndef to_list(obj):\n    items = obj.replace(\"(\", '').replace(\")\", \"\")\n    items = items.split(\",\")\n    lst = [float(i) for i in items]\n\n    return lst\n\n\ndef to_tuple(obj):\n    items = obj.replace(\"(\", '').replace(\")\", \"\")\n    items = items.split(\",\")\n    tpl = tuple([float(i) for i in items])\n    return tpl\n"
  },
  {
    "path": "util/summary.py",
    "content": "from tensorboardX import SummaryWriter\nfrom util.misc import mkdirs\n\n\nclass LogSummary(object):\n\n    def __init__(self, log_path):\n\n        mkdirs(log_path)\n        self.writer = SummaryWriter(log_path)\n\n    def write_scalars(self, scalar_dict, n_iter, tag=None):\n\n        for name, scalar in scalar_dict.items():\n            if tag is not None:\n                name = '/'.join([tag, name])\n            self.writer.add_scalar(name, scalar, n_iter)\n\n    def write_hist_parameters(self, net, n_iter):\n        for name, param in net.named_parameters():\n            self.writer.add_histogram(name, param.clone().cpu().numpy(), n_iter)\n\n\n\n\n\n"
  },
  {
    "path": "util/vis_flux.py",
    "content": "import sys\nimport scipy.io as sio\nimport math\nimport numpy as np\nimport cv2\nimport matplotlib\nmatplotlib.use('agg')\nimport pylab as plt\nfrom matplotlib import cm\nimport os\n\ndef label2color(label):\n\n    label = label.astype(np.uint16)\n    \n    height, width = label.shape\n    color3u = np.zeros((height, width, 3), dtype=np.uint8)\n    unique_labels = np.unique(label)\n\n    if unique_labels[-1] >= 2**24:       \n        raise RuntimeError('Error: label overflow!')\n\n    for i in range(len(unique_labels)):\n    \n        binary = '{:024b}'.format(unique_labels[i])\n        # r g b 3*8 24\n        r = int(binary[::3][::-1], 2)\n        g = int(binary[1::3][::-1], 2)\n        b = int(binary[2::3][::-1], 2)\n\n        color3u[label == unique_labels[i]] = np.array([r, g, b])\n\n    return color3u\n\n\ndef vis_direction_field(gt_flux):\n\n    norm_gt = np.sqrt(gt_flux[1, :, :] ** 2 + gt_flux[0, :, :] ** 2)\n    angle_gt = 180 / math.pi * np.arctan2(gt_flux[1, :, :], gt_flux[0, :, :])\n\n    fig = plt.figure(figsize=(10, 6))\n\n    ax1 = fig.add_subplot(121)\n    ax1.set_title('Norm_gt')\n    ax1.set_autoscale_on(True)\n    im1 = ax1.imshow(norm_gt, cmap=cm.jet)\n    plt.colorbar(im1, shrink=0.5)\n\n    ax2 = fig.add_subplot(122)\n    ax2.set_title('Angle_gt')\n    ax2.set_autoscale_on(True)\n    im2 = ax2.imshow(angle_gt, cmap=cm.jet)\n    plt.colorbar(im2, shrink=0.5)\n\n    plt.savefig('1.png')\n    plt.close(fig)\n\n\ndef vis_flux(vis_image, pred_flux, gt_flux, gt_mask, image_name, save_dir):\n\n    vis_image = vis_image.data.cpu().numpy()[0, ...]\n    pred_flux = pred_flux.data.cpu().numpy()[0, ...]\n    gt_flux = gt_flux.data.cpu().numpy()[0, ...]\n    gt_mask = gt_mask.data.cpu().numpy()[0, ...]\n    \n    image_name = image_name[0]\n\n    norm_pred = np.sqrt(pred_flux[1,:,:]**2 + pred_flux[0,:,:]**2)\n    angle_pred = 180/math.pi*np.arctan2(pred_flux[1,:,:], pred_flux[0,:,:])\n\n    norm_gt = np.sqrt(gt_flux[1,:,:]**2 + 
gt_flux[0,:,:]**2)\n    angle_gt = 180/math.pi*np.arctan2(gt_flux[1,:,:], gt_flux[0,:,:])\n\n    fig = plt.figure(figsize=(10,6))\n\n    ax0 = fig.add_subplot(231)\n    ax0.imshow(vis_image[:,:,::-1])\n\n    ax1 = fig.add_subplot(232)\n    ax1.set_title('Norm_gt')\n    ax1.set_autoscale_on(True)\n    im1 = ax1.imshow(norm_gt, cmap=cm.jet)\n    plt.colorbar(im1,shrink=0.5)\n\n    ax2 = fig.add_subplot(233)\n    ax2.set_title('Angle_gt')\n    ax2.set_autoscale_on(True)\n    im2 = ax2.imshow(angle_gt, cmap=cm.jet)\n    plt.colorbar(im2, shrink=0.5)\n\n    ax5 = fig.add_subplot(234)\n    color_mask = label2color(gt_mask)\n    ax5.imshow(color_mask)\n\n    ax4 = fig.add_subplot(235)\n    ax4.set_title('Norm_pred')\n    ax4.set_autoscale_on(True)\n    im4 = ax4.imshow(norm_pred, cmap=cm.jet)\n    plt.colorbar(im4,shrink=0.5)\n\n    ax5 = fig.add_subplot(236)\n    ax5.set_title('Angle_pred')\n    ax5.set_autoscale_on(True)\n    im5 = ax5.imshow(angle_pred, cmap=cm.jet)\n    plt.colorbar(im5, shrink=0.5)\n\n    plt.savefig(save_dir + image_name + '.png')\n    plt.close(fig)\n"
  },
  {
    "path": "util/visualize.py",
    "content": "import torch\nimport numpy as np\nimport cv2\nimport os\nimport math\nfrom cfglib.config import config as cfg\nfrom util import canvas as cav\nimport matplotlib\nmatplotlib.use('agg')\nimport pylab as plt\nfrom matplotlib import cm\nimport torch.nn.functional as F\n\n\ndef visualize_network_output(output_dict, input_dict, mode='train'):\n    vis_dir = os.path.join(cfg.vis_dir, cfg.exp_name + '_' + mode)\n    if not os.path.exists(vis_dir):\n        os.mkdir(vis_dir)\n\n    fy_preds = F.interpolate(output_dict[\"fy_preds\"], scale_factor=cfg.scale, mode='bilinear')\n    fy_preds = fy_preds.data.cpu().numpy()\n\n    py_preds = output_dict[\"py_preds\"][1:]\n    init_polys = output_dict[\"py_preds\"][0]\n    inds = output_dict[\"inds\"]\n\n    image = input_dict['img']\n    tr_mask = input_dict['tr_mask'].data.cpu().numpy() > 0\n    distance_field = input_dict['distance_field'].data.cpu().numpy()\n    direction_field = input_dict['direction_field']\n    weight_matrix = input_dict['weight_matrix']\n    gt_tags = input_dict['gt_points'].cpu().numpy()\n    ignore_tags = input_dict['ignore_tags'].cpu().numpy()\n\n    b, c, _, _ = fy_preds.shape\n    for i in range(b):\n\n        fig = plt.figure(figsize=(12, 9))\n\n        mask_pred = fy_preds[i, 0, :, :]\n        distance_pred = fy_preds[i, 1, :, :]\n        norm_pred = np.sqrt(fy_preds[i, 2, :, :] ** 2 + fy_preds[i, 3, :, :] ** 2)\n        angle_pred = 180 / math.pi * np.arctan2(fy_preds[i, 2, :, :], fy_preds[i, 3, :, :] + 0.00001)\n\n        ax1 = fig.add_subplot(341)\n        ax1.set_title('mask_pred')\n        # ax1.set_autoscale_on(True)\n        im1 = ax1.imshow(mask_pred, cmap=cm.jet)\n        # plt.colorbar(im1, shrink=0.5)\n\n        ax2 = fig.add_subplot(342)\n        ax2.set_title('distance_pred')\n        # ax2.set_autoscale_on(True)\n        im2 = ax2.imshow(distance_pred, cmap=cm.jet)\n        # plt.colorbar(im2, shrink=0.5)\n\n        ax3 = fig.add_subplot(343)\n        
ax3.set_title('norm_pred')\n        # ax3.set_autoscale_on(True)\n        im3 = ax3.imshow(norm_pred, cmap=cm.jet)\n        # plt.colorbar(im3, shrink=0.5)\n\n        ax4 = fig.add_subplot(344)\n        ax4.set_title('angle_pred')\n        # ax4.set_autoscale_on(True)\n        im4 = ax4.imshow(angle_pred, cmap=cm.jet)\n        # plt.colorbar(im4, shrink=0.5)\n\n        mask_gt = tr_mask[i]\n        distance_gt = distance_field[i]\n        # gt_flux = 0.999999 * direction_field[i] / (direction_field[i].norm(p=2, dim=0) + 1e-9)\n        gt_flux = direction_field[i].cpu().numpy()\n        norm_gt = np.sqrt(gt_flux[0, :, :] ** 2 + gt_flux[1, :, :] ** 2)\n        angle_gt = 180 / math.pi * np.arctan2(gt_flux[0, :, :], gt_flux[1, :, :]+0.00001)\n\n        ax11 = fig.add_subplot(345)\n        # ax11.set_title('mask_gt')\n        # ax11.set_autoscale_on(True)\n        im11 = ax11.imshow(mask_gt, cmap=cm.jet)\n        # plt.colorbar(im11, shrink=0.5)\n\n        ax22 = fig.add_subplot(346)\n        # ax22.set_title('distance_gt')\n        # ax22.set_autoscale_on(True)\n        im22 = ax22.imshow(distance_gt, cmap=cm.jet)\n        # plt.colorbar(im22, shrink=0.5)\n\n        ax33 = fig.add_subplot(347)\n        # ax33.set_title('norm_gt')\n        # ax33.set_autoscale_on(True)\n        im33 = ax33.imshow(norm_gt, cmap=cm.jet)\n        # plt.colorbar(im33, shrink=0.5)\n\n        ax44 = fig.add_subplot(348)\n        # ax44.set_title('angle_gt')\n        # ax44.set_autoscale_on(True)\n        im44 = ax44.imshow(angle_gt, cmap=cm.jet)\n        # plt.colorbar(im44, shrink=0.5)\n\n        img_show = image[i].permute(1, 2, 0).cpu().numpy()\n        img_show = ((img_show * cfg.stds + cfg.means) * 255).astype(np.uint8)\n        img_show = np.ascontiguousarray(img_show[:, :, ::-1])\n        shows = []\n        gt = gt_tags[i]\n        gt_idx = np.where(ignore_tags[i] > 0)\n        gt_py = gt[gt_idx[0], :, :]\n        index = torch.where(inds[0] == i)[0]\n        init_py = 
init_polys[index].detach().cpu().numpy()\n\n        image_show = img_show.copy()\n        cv2.drawContours(image_show, init_py.astype(np.int32), -1, (255, 255, 0), 2)\n        cv2.drawContours(image_show, gt_py.astype(np.int32), -1, (0, 255, 0), 2)\n        shows.append(image_show)\n        for py in py_preds:\n            contours = py[index].detach().cpu().numpy()\n            image_show = img_show.copy()\n            cv2.drawContours(image_show, init_py.astype(np.int32), -1, (255, 255, 0), 2)\n            cv2.drawContours(image_show, gt_py.astype(np.int32), -1, (0, 255, 0), 2)\n            cv2.drawContours(image_show, contours.astype(np.int32), -1, (0, 0, 255), 2)\n            shows.append(image_show)\n\n        for idx, im_show in enumerate(shows):\n            axb = fig.add_subplot(3, 4, 9+idx)\n            # axb.set_title('boundary_{}'.format(idx))\n            # axb.set_autoscale_on(True)\n            im11 = axb.imshow(im_show, cmap=cm.jet)\n            # plt.colorbar(im11, shrink=0.5)\n\n        path = os.path.join(vis_dir, '{}.png'.format(i))\n        plt.savefig(path)\n        plt.close(fig)\n\n\ndef visualize_gt(image, contours, label_tag):\n\n    image_show = image.copy()\n    image_show = np.ascontiguousarray(image_show[:, :, ::-1])\n\n    image_show = cv2.polylines(image_show,\n                               [contours[i] for i, tag in enumerate(label_tag) if tag >0], True, (0, 0, 255), 3)\n    image_show = cv2.polylines(image_show,\n                               [contours[i] for i, tag in enumerate(label_tag) if tag <0], True, (0, 255, 0), 3)\n\n    show_gt = cv2.resize(image_show, (320, 320))\n\n    return show_gt\n\n\ndef visualize_detection(image, output_dict, meta=None):\n    image_show = image.copy()\n    image_show = np.ascontiguousarray(image_show[:, :, ::-1])\n\n    cls_preds = F.interpolate(output_dict[\"fy_preds\"], scale_factor=cfg.scale, mode='bilinear')\n    cls_preds = cls_preds[0].data.cpu().numpy()\n\n    py_preds = 
output_dict[\"py_preds\"][1:]\n    init_polys = output_dict[\"py_preds\"][0]\n    shows = []\n\n    init_py = init_polys.data.cpu().numpy()\n    path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name),\n                        meta['image_id'][0].split(\".\")[0] + \"_init.png\")\n\n    im_show0 = image_show.copy()\n    for i, bpts in enumerate(init_py.astype(np.int32)):\n        cv2.drawContours(im_show0, [bpts.astype(np.int32)], -1, (255, 255, 0), 2)\n        for j, pp in enumerate(bpts):\n            if j == 0:\n                cv2.circle(im_show0, (int(pp[0]), int(pp[1])), 3, (255, 0, 255), -1)\n            elif j == 1:\n                cv2.circle(im_show0, (int(pp[0]), int(pp[1])), 3, (0, 255, 255), -1)\n            else:\n                cv2.circle(im_show0, (int(pp[0]), int(pp[1])), 3, (0, 0, 255), -1)\n\n    cv2.imwrite(path, im_show0)\n\n    for idx, py in enumerate(py_preds):\n        im_show = im_show0.copy()\n        contours = py.data.cpu().numpy()\n        cv2.drawContours(im_show, contours.astype(np.int32), -1, (0, 0, 255), 2)\n        for ppts in contours:\n            for j, pp in enumerate(ppts):\n                if j == 0:\n                    cv2.circle(im_show, (int(pp[0]), int(pp[1])), 3, (255, 0, 255), -1)\n                elif j == 1:\n                    cv2.circle(im_show, (int(pp[0]), int(pp[1])), 3, (0, 255, 255), -1)\n                else:\n                    cv2.circle(im_show, (int(pp[0]), int(pp[1])), 3, (0, 255, 0), -1)\n        path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name),\n                             meta['image_id'][0].split(\".\")[0] + \"_{}iter.png\".format(idx))\n        cv2.imwrite(path, im_show)\n        shows.append(im_show)\n\n    # init_py = init_polys.data.cpu().numpy()\n    # im_show_score = image_show.copy()\n    # for in_py in init_py:\n    #     mask = np.zeros_like(cls_preds[0], dtype=np.uint8)\n    #     cv2.drawContours(mask, [in_py.astype(np.int32)], -1, (1,), -1)\n    #     score = 
cls_preds[0][mask > 0].mean()\n    #     if score > 0.9:\n    #         cv2.drawContours(im_show_score, [in_py.astype(np.int32)], -1, (0, 255, 0), 2)\n    #     else:\n    #         cv2.drawContours(im_show_score, [in_py.astype(np.int32)], -1, (255, 0, 255), 2)\n    #     cv2.putText(im_show_score, \"{:.2f}\".format(score),\n    #                 (int(np.mean(in_py[:, 0])), int(np.mean(in_py[:, 1]))), 1, 1, (0, 255, 255), 2)\n    #     print(score)\n\n    # path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name),\n    #                     meta['image_id'][0].split(\".\")[0] + \"init.png\")\n    # cv2.imwrite(path, im_show_score)\n\n    show_img = np.concatenate(shows, axis=1)\n    show_boundary = cv2.resize(show_img, (320 * len(py_preds), 320))\n\n    # fig = plt.figure(figsize=(5, 4))\n    # ax1 = fig.add_subplot(111)\n    # # ax1.set_title('distance_field')\n    # ax1.set_autoscale_on(True)\n    # im1 = ax1.imshow(cls_preds[0], cmap=cm.jet)\n    # plt.colorbar(im1, shrink=0.75)\n    # plt.axis(\"off\")\n    # path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name),\n    #                     meta['image_id'][0].split(\".\")[0] + \"_cls.png\")\n    # plt.savefig(path, dpi=300)\n    # plt.close(fig)\n    #\n    # fig = plt.figure(figsize=(5, 4))\n    # ax1 = fig.add_subplot(111)\n    # # ax1.set_title('distance_field')\n    # ax1.set_autoscale_on(True)\n    # im1 = ax1.imshow(np.array(cls_preds[1] / np.max(cls_preds[1])), cmap=cm.jet)\n    # plt.colorbar(im1, shrink=0.75)\n    # plt.axis(\"off\")\n    # path = os.path.join(cfg.vis_dir, '{}_test'.format(cfg.exp_name),\n    #                     meta['image_id'][0].split(\".\")[0] + \"_dis.png\")\n    # plt.savefig(path, dpi=300)\n    # plt.close(fig)\n\n    cls_pred = cav.heatmap(np.array(cls_preds[0] * 255, dtype=np.uint8))\n    dis_pred = cav.heatmap(np.array(cls_preds[1] * 255, dtype=np.uint8))\n\n    heat_map = np.concatenate([cls_pred*255, dis_pred*255], axis=1)\n    heat_map = 
cv2.resize(heat_map, (320 * 2, 320))\n\n    return show_boundary, heat_map"
  },
  {
    "path": "vis/README.md",
    "content": "Save the visualization image here"
  }
]